The Canvas line of flash cards, available in SD and microSD formats. Courtesy of Kingston Technology.
Article by Marcus Siu
Before HD video became popular, when point-and-shoot cameras were the main reason consumers bought SD cards, memory card companies designated speed classes with numbers from 2 to 10 printed on the cards. If a card was rated a "4", it guaranteed a minimum write speed of 4 MB per second. If it was rated a "10", it meant 10 MB per second.
At that time, “ten” was the absolute best.
Then high-speed cameras, HD video, and smartphones came into the picture, and the speed class system became somewhat outdated, since it is now very difficult to find a card rated less than a "10". They seem to have gone extinct, except for the ones sold at the dollar store. So another classification was introduced on flash cards: the UHS Speed Class, where UHS stands for "Ultra High Speed".
Now that pretty much all SD cards are rated a "10" and HD has gone mainstream, the question comes up: how much speed do you need from an SD card for your device?
Currently there is UHS Speed Class 1 (U1), which is perfect for full 1080p HD and 3D video, and UHS Speed Class 3 (U3), which is for 4K video and burst-mode action photography.
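For readers who like the markings spelled out, the guaranteed minimums behind each label can be summarized in a small sketch. The table values are the SD Association's standard minimum sequential write speeds; the helper function itself is just illustrative:

```python
# Minimum guaranteed sequential write speeds (MB/s) for each card marking,
# per the SD Association's speed class definitions.
MIN_WRITE_MB_S = {
    "Class 2": 2,
    "Class 4": 4,
    "Class 6": 6,
    "Class 10": 10,
    "U1": 10,   # UHS Speed Class 1: full HD video
    "U3": 30,   # UHS Speed Class 3: 4K video, burst photography
}

def fast_enough(marking: str, required_mb_s: float) -> bool:
    """Return True if a card with this marking guarantees at least
    the required sustained write speed."""
    return MIN_WRITE_MB_S[marking] >= required_mb_s

# A 4K workload needing roughly 25 MB/s sustained:
print(fast_enough("U3", 25))   # True
print(fast_enough("U1", 25))   # False
```

Note that these are guaranteed minimums; real cards, like the Canvas line discussed below, often sustain far higher speeds than their class marking implies.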
With so many different brands and product lines, it has always been confusing to know which card to choose and how fast an SD card you really need. Now, with the "Canvas" line introduced in March, Kingston Technology has made it much easier for the average consumer to pick an SD card in the marketplace.
The Canvas series comes in SD or Micro SD formats with the “React”, “Go!”, and “Select” on display at Pepcom’s “Digital Experience” in San Francisco. Photo by Marcus Siu.
The new “Canvas” line (from top to bottom) is “Canvas React” (in gold lettering), “Canvas Go” (in silver lettering), and “Canvas Select” (in black lettering).
In other words, “great”, “even greater” and “absolute best”.
As a photographer and videographer, I find it absolutely crucial to choose the right SD card for my camera equipment. When I'm shooting HD video and bursting photos at a moving subject, I need a card that writes as fast as my camera shoots, so I can be sure of getting all the footage I intended without any unpleasant surprises.
If your SD card isn't fast enough, the camera will pause and the video clip will abruptly end, because the card can't keep up with the camera. This has happened to me on many occasions, and at the time I blamed the camera. Now I know better about the technical limitations of SD cards.
For people like me, who use high-end DSLRs, mirrorless cameras, and 4K video, Kingston's "Canvas React" is the optimum choice, with its amazing 100 MB/s read speed and even more impressive 80 MB/s write speed. It's remarkable that it writes almost as fast as it reads.
However, if I'm shopping for an SD card for my mom, who just takes pictures with her Kodak point-and-shoot, there is no need to get the absolute best for her $59 economy camera. In her case, the "Canvas Select" would be the appropriate choice.
Jensen Huang, CEO of Nvidia, addresses the capacity crowd at the San Jose Convention Center at GTC 2018. Photo by Marcus Siu.
Article and photos by Marcus Siu
SAN JOSE, MARCH 24, 2018 – Over 8,500 people attended GTC 2018 last month at the San Jose Convention Center, taking in over 600 hours of sessions, with 400 of them devoted exclusively to artificial intelligence (A.I.). GTC is also known to host one of the best VR/AR conferences in the world for all industries, including hundreds of startups.
The list of attendees at GTC is dazzling. All of the top technology companies, film companies, game companies, auto companies, smart-city governments, and medical universities, among others, attended the conference. The way attendance has been growing dramatically each year, Nvidia may need to consider moving to a bigger venue in the near future.
After a very impressive "I am A.I." trailer demonstrating the various ways Nvidia and its collaborative partners continue to shape the world with deep learning and A.I., Nvidia's founder and CEO, Jensen Huang, took center stage to open the keynote address to the packed audience. He organized his agenda into four main topics: "amazing" graphics, "amazing" science, "amazing" A.I., and "amazing" robots.
It seemed appropriate that Huang opened his keynote with ray tracing as part of his "amazing graphics" topic, since it had been announced a week prior to GTC at the Game Developers Conference (GDC) in San Francisco. With the demands of AR and VR in recent years, Huang acknowledged, many avenues and opportunities in GPU technology had opened up, which eventually, with the assistance of deep learning, brought computer graphics to real-time ray tracing.
“Ray tracing…is the Holy Grail, the dream for computer scientists for the last 40 years”. Photo by Marcus Siu.
As Huang presented his keynote address, a huge image of a desk in a modern, well-lit room filled the giant screen behind him.
"Computer graphics is the driving force of the GPU. It is computationally insatiable, recreating virtual reality in the daunting computing tasks we know," Huang remarked. "The computer graphics industry all around the world has been pursuing this holy grail, this dream of creating photo-realistic images."
Huang continued, "It takes thousands and thousands of CPUs and servers in order to calculate and compute each one of these frames. One CPU would take hours to compute one frame, and a movie…has hundreds of thousands of frames before they can create the final film."
Taking a closer look at the image behind Huang, the modern, well-lit room turned out to be a beautifully composed computer-generated image rendered by developers. It's quite uncanny how realistic it is: the image included several objects on the desk, among them a crystal glass, a Rubik's Cube, a stormtrooper figurine, a Newton's cradle of silver balls, a small mirror, and a few colored gummy bears scattered across the table, along with an assortment of other items. Each of these items is notoriously challenging for a computer graphics scientist to render, yet, done well, is completely indistinguishable from the real thing.
Huang pointed out the beauty and realism of the rendered objects on the desk, commenting on what it takes to make an image as photo-realistic as possible and trick the viewer's eyes with Nvidia's ray-tracing techniques: screen-space ambient occlusion, baked lighting, global illumination, screen-space reflections and environment maps, screen-space refraction, caustics, depth sorting, and subsurface scattering (and its approximation), which made the gummy bears come to life, for example.
"Using ray tracing, the gummy bear looks like you can just pick it up and eat it," remarked Huang.
Jedi mind tricks believe you will.
“All of those computational and intensive problems; the effect of light as it travels through your room, the environment, is so hard to compute and that is why ray tracing has become so popular in film and it is the holy grail; the dream of computer scientists for the last forty years”, Huang remarked.
Most of this work used to take many hours per frame, even for the most experienced graphics professionals. For the gaming and film industries, the main beneficiaries of the latest RTX technology, it is now much easier for a developer to compose a photo-realistic image without a supercomputer.
“This is what we can do now; a $68,000 computer vs a super computer.”
Jensen Huang demonstrates real time ray tracing at GTC 2018. Photo by Marcus Siu.
Steve Parker, an engineer on the Nvidia ray-tracing team who has worked on ray tracing his entire career and joined the company ten years ago, demonstrated along with Huang an entire Star Wars demo in real time on just one Nvidia DGX Station with four Voltas, instead of a supercomputer rendering these scenes at one frame every ten hours, as on a Hollywood movie such as "The Jungle Book".
“From tools makers (Adobe, AutoDesk), engine makers (Epic, Unity), film studios (Pixar, Industrial Light & Magic); they’ve all come out to adopt this technology. The Nvidia RTX technology will revolutionize the way they do work. They can finally do ray tracing in real time; try more frames, create more beautiful shots, deliver to the customer faster and on time and most importantly, save millions of dollars in the process.”
"This technology is the single most important advance in computer graphics in the last 15 years…I believe that Nvidia RTX is going to define the future of computer graphics. This is a very, very big deal," Huang remarked.
REVOLUTIONARY PRODUCTS
“After 10 years, what makes this special, is that for the very first time we can bring real time ray tracing to the market. People can actually use it. The technology has been encapsulated into multiple layers from our GPU architecture to the algorithms that make it possible for us to do this.”
"You are also seeing deep learning in action. Without deep learning, it would be impossible to have traced all of those rays…whereas deep learning has been used in the past for super resolution, we've been using it for super rays; predicting rays, so that we can fill in the spots where we know what the right answer is going to be using A.I."
"Nvidia's Volta GPU, the RTX technology, the solvers, the architectures, the libraries have now been integrated into three of the most important rendering APIs: Nvidia OptiX, Microsoft's DX12 extension called DirectX Raytracing (DXR), and Vulkan."
Huang officially announced the NVIDIA Quadro GV100 GPU with NVIDIA RTX technology, delivering real-time ray tracing to millions of artists and designers for the first time.
Nvidia estimates that a total of one billion images are rendered each year by the graphics industry. That includes images from 400 gaming products, 500 movies from media and entertainment, 12 million product designers, and 150,000 architects. With the Quadro GV100 GPU and RTX technology, Huang believes the number of rendered images will jump by a factor of ten.
NVIDIA CEO Jensen Huang introduces the Quadro GV100 with Nvidia RTX technology at GTC 2018, delivering real-time ray tracing to millions of artists and designers for the first time. Photo by Marcus Siu.
Huang also announced some new upgrades and products during the conference, including doubling the memory on the NVIDIA Tesla V100 and introducing the new NVIDIA NVSwitch, which enables 16 Tesla V100 GPUs to communicate at a record speed of 2.4 terabytes per second.
"The world's first workstation GPU based on the Volta architecture. It's also the first one that has a brand new interconnect between GPUs called NVLink 2; a super-high-speed interconnect between two GPUs that basically extends the programming (memory) model out of one GPU into the other, which means all of the memory reads and writes and the atomics work exactly the same. Software doesn't have to change. Two GPUs connected through this new NVLink interconnect are essentially one giant GPU. So these two GPUs working together; two GV100s, will become a revolutionary new workstation. It's going to be available from HP, Dell and Lenovo."
The specs of this new "workstation": 64GB of HBM2 memory, 10,240 CUDA cores, and 236 teraflops of Tensor Core performance.
Nvidia founder and CEO Jensen Huang introduces the DGX-2, the largest GPU ever created, which can replace a typical render farm of 280 dual-CPU servers consuming 168 kW with 14 quad-GPU servers using 24 kW. Photo by Marcus Siu.
Huang also announced the DGX-2, the first single server capable of delivering two petaflops of computational power, with 512GB of HBM2 memory, 81,920 CUDA cores, and 2,000 teraflops of Tensor Core performance. The DGX-2 has the deep learning processing power of 300 servers occupying 15 racks of data center space, while being 60x smaller and 18x more power efficient.
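Taken at face value, the render-farm comparison quoted above (280 dual-CPU servers drawing 168 kW versus 14 quad-GPU servers drawing 24 kW) works out to roughly a twentieth of the machines at a seventh of the power; a quick back-of-the-envelope check:

```python
# Figures as quoted in the article; treat them as Nvidia's marketing
# numbers, not independent measurements.
farm_servers, farm_kw = 280, 168
dgx_servers, dgx_kw = 14, 24

print(farm_servers / dgx_servers)  # 20.0 -> 1/20th the machines
print(farm_kw / dgx_kw)            # 7.0  -> 1/7th the power draw
```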
To Joe Moviemaker, that means it can transfer the equivalent of 14,400 ten-gigabyte movies in one second, with the flip of a switch.
For filmmakers, game developers, product designers, architects, and anyone whose profession requires rendering computer graphics, Nvidia has begun a new evolution: the holy grail has been found, and the universe is now endless.
This means that for all computer scientists, the force will be with them. Always.
Nvidia CEO and founder, Jensen Huang, interviews with Jim Cramer of “Mad Money” on the Nvidia Expo floor at the San Jose Convention Center. Photo by Marcus Siu.
SAN FRANCISCO, MARCH 25, 2018 – At this year's GDC 2018 (Game Developers Conference), a couple of demos stood out that strongly reinforced my thoughts about the eventual merging of computer gaming and movies.
After just a few demos on the GDC expo floor, I was no longer able to discern the difference between computer game graphics and live action from movies. Realism for game developers has never looked so real.
On the first day of GDC, NVIDIA announced its latest RTX technology for real-time cinematic rendering. Nvidia's RTX technology, alongside Microsoft's new DirectX® Raytracing (DXR) API, has been an intensive work in progress for the last ten years. Its ray tracing renders lifelike lighting, reflections, and shadows that make it nearly impossible to distinguish what is real and what is computer graphics. It brings real-time, cinematic-quality rendering to content creators and game developers.
Yes, you read right: “real time”.
This new technology is a milestone not just for game developers and filmmakers, but for any creator who needs to render an object as realistically as possible. The computer graphics of tomorrow will make the computer graphics of today look like a lifeless imitation.
Nvidia showed Project Spotlight's "Reflections" at its booth at GDC. It looks more like a teaser trailer for the next Star Wars feature film, but in reality it's a real-time ray-tracing demo that Epic's team created along with ILMxLAB, running on NVIDIA's DGX Station equipped with four Tesla V100 GPUs, Epic's Unreal Engine, and NVIDIA's RTX ray-tracing technology.
It definitely wowed the enthusiastic GDC crowd, as it was hard to believe there were no actual actors in the stormtrooper costumes. It was all computer graphics being rendered.
NVIDIA has also announced that the GameWorks SDK will add a ray-tracing denoiser module, helping game developers take advantage of new capabilities. This updated SDK, which is coming soon, includes support for ray-traced area light shadows, glossy reflections and ambient occlusion. This will help save a huge amount of time for creators.
At GDC, game developers will have access to ray-tracing denoiser module, part of the GameWorks SDK from Nvidia. Photo by Marcus Siu.
Imagine how the Screen Actors Guild will feel once its members realize that computer-generated characters may jeopardize their careers in the near future. One of those actors, however, should be just fine.
Motion-capture performance artist Andy Serkis, known for playing "Gollum" in "The Lord of the Rings" trilogy and "Caesar" in the "Planet of the Apes" trilogy, may not have problems finding work. Just a few sections away from the Nvidia booth, I saw a computer-generated "digital" Andy Serkis acting out his lines as his alien creature character was rendered in real time at the Unreal booth.
Unreal indeed.
Just unreal – Andy Serkis’s character talks as Andy talks in real time at the GDC 2018. Photo by Marcus Siu
Maybe in the not-so-distant future, we can get rid of the entire Screen Actors Guild, with the exception of performance-capture and voice-over actors. Or perhaps one day we can just clone the actors, so there wouldn't be any need for them to come in and perform on a sound stage.
Imagine at the Academy Awards: "…and the nominees for Best Clone Actor in a Supporting Role are…"
In addition to the progress in computer graphics coming from ray tracing on the visual side, audio will be just as important for content creators. Consumers are continuously looking for an immersive gaming experience, so many are reaching for THX-certified equipment for their PC gaming.
THX demonstrated its spatial audio platform, using the latest audio standard, MPEG-H, as well as UHD. They used the game trailer for "Hellblade", which, ironically, was one of the first live motion-capture demonstrations for Epic Games in 2006. It was, in a way, ahead of its time. Coming from a 2.1 THX-certified Logitech speaker setup with a subwoofer, it was quite sonically immersive. I felt I was right in the middle of the soundscape.
There was also a demo of it using headphones, but I still preferred the speakers.
Certified THX Logitech Speakers at GDC. Photo by Marcus Siu.
In addition, they utilize personalized audio profiles using HRTFs (Head-Related Transfer Functions), which are optimized and customized for each listener based on the user's unique hearing anatomy.
Long associated with Lucasfilm movie sound in theatres in the '80s, THX seems to be staging a comeback in the marketplace. In addition to its traditional THX-certified home entertainment products over the years, such as projectors, pre-amps, receivers, and speakers, the company has been gaining momentum in the gaming world by introducing THX-certified laptops, headsets, and satellite speakers.
Frances McDormand as “Mildred Hayes” in THREE BILLBOARDS OUTSIDE EBBING, MISSOURI.
Article by Marcus Siu
It seems like déjà vu all over again, with "The Shape of Water" landing 13 nominations this year, the exact same number "La La Land" landed a year ago before failing to win Best Picture. I predicted "La La Land" to win, and a year later I was still trying to analyze why it lost. My conclusion was that it was too much of a "fantasy" movie. After all, it's a movie about "dreamers".
Historically, the Academy has never voted a science fiction film Best Picture, nor a "fantasy" film, with the exception of "The Lord of the Rings: The Return of the King". You would think something like "2001: A Space Odyssey" could have won Best Picture, but it wasn't even nominated that year. The Academy now tends to vote for films with a social message and serious subjects rather than pure spectacle and escapism, which is probably why "Moonlight" won Best Picture over "La La Land".
Gone are the days when all you had to do was count Academy Award nominations to predict your Best Picture winner. Giant epic Hollywood blockbusters such as "Gone with the Wind", "Lawrence of Arabia", and "Titanic" would receive ten to thirteen nominations each and end up taking the Best Picture Oscar 90% of the time, but this is no longer the case.
Now, with more diversity recruited into the Academy's membership over the last several years, it's no longer pure spectacle and studio production budgets that score points and connect with the Academy. It's more about an original script that dares to be different, along with the passion, commitment, and performances of the cast and direction. It's all about filmmakers who aren't afraid to take risks to tell a story that needs to be told.
In fact, over the years, the movies critics liked seemed to differ from the Academy's Best Picture choices. They were never the same: critics were always more into story-driven dialogue films, whereas the Academy favored production-driven ones. However, the gap has grown smaller and smaller, as small-budget indie films with great scripts and casts, such as "Spotlight", "Moonlight", "The Artist", and even "Birdman", have won the Oscar. These films were made because the filmmakers believed in them, even when the studios didn't. They resonated with the Academy, so a big studio budget is no longer required to win Best Picture.
Look for "Three Billboards Outside Ebbing, Missouri" to edge out "The Shape of Water". "Billboards" has everything going for it, including a supporting cast that couldn't be better and a witty screenplay by one of the greatest playwrights in the world, Martin McDonagh, with the kind of social message the Academy finds mandatory in "Best Picture" material.
"The Shape of Water", with its 13 Oscar nominations, would normally be the favorite to win Best Picture, especially when the runner-up, "Dunkirk", has "only" eight nods. I still don't believe the Academy will vote for a movie about a romance between a mute woman and a fish man, though I wouldn't be surprised if it won. My theory still stands: Academy voters tend to vote for human dramas about real people instead of fantasy films. Fantasy just doesn't seem to survive with the Academy. It's like a fish out of water.
Another possible upset for the Best Picture trophy is "Get Out", a timely social-political film about racism that surprised many when its nominations were announced. With Jordan Peele nominated for Best Director and Best Original Screenplay, and lead actor Daniel Kaluuya nominated as well, it has to be at the top of the heap to pull an upset if "Three Billboards" or "The Shape of Water" doesn't win. However, the movie is too much of a psychological horror story to fit the bill of a Best Picture winner, either.
Historically, the movie that wins Best Picture takes the Best Director Oscar as well, but that may not be the case this year: if "Three Billboards" wins Best Picture, its director wasn't even nominated. This time the Oscar will most likely go to Guillermo del Toro for "The Shape of Water". He has already taken the DGA and BAFTA awards, so he should take home the Best Director Oscar hands down.
Gary Oldman should be a shoo-in to win Best Actor for playing Winston Churchill in "Darkest Hour". He deserves it, not just because he sat through four hours of makeup every day, but because he stayed in character the entire time on set; beyond that, his performance was extraordinary. Over the years he has been a reliable chameleon, usually unrecognizable, delivering great performances role after role. This performance, however, may be remembered as his crowning achievement, his best ever.
"Three Billboards" will probably take home statuettes for Frances McDormand and Sam Rockwell, as they seem to have swept all the other major awards for this dramedy. Like Oldman, it's great to see Rockwell finally being recognized after paying his dues as a character actor. Let's see if he gets more leading roles after he wins his trophy.
Look for Allison Janney to take the Supporting Actress Oscar as Tonya Harding's mother in "I, Tonya": writer Steven Rogers, a long-time friend of Janney, wrote the role of LaVona Golden with her in mind. Janney also had to find the right bird to perch on her shoulder, as she is not a "bird" person. She nailed the performance and has taken every single major award.
LaVona Golden (Allison Janney) and her pet bird in I, TONYA, courtesy of NEON
For Best Original Screenplay, it's between "Three Billboards"' Martin McDonagh and "Lady Bird"'s Greta Gerwig. The Academy will most likely recognize Gerwig, since she was also nominated for Best Director for "Lady Bird", which earned additional Oscar nods for its lead actors, Saoirse Ronan and Laurie Metcalf.
James Ivory should take Best Adapted Screenplay for "Call Me by Your Name", as there is no serious competition; "Mudbound" and "Molly's Game" are distant seconds. If he wins, he will be the oldest Oscar recipient at 89, unless Agnès Varda takes Best Documentary the same night.
Look for Pixar's "Coco" to win two Oscars: Best Animated Feature and Best Original Song for "Remember Me". Its main competition for the song award is "This Is Me" from "The Greatest Showman", a big commercial radio hit that could beat "Coco", but "Remember Me" was integral to the movie and not merely an end-credits song.
Agnès Varda (left) and JR (right) in Faces Places directed by Agnès Varda and JR. Photo courtesy of Cohen Media Group
In the Documentary category, all the films have equally powerful subjects. It's just that Varda's buddy movie "Faces Places" stands out because it is delightful to watch, though difficult to categorize as a traditional documentary. Still, "Icarus", a film about Russia's doping of its Olympic athletes, "Last Men in Aleppo" from Syria, and Steve James' "Abacus" really caught my attention and deserve the golden statuette as well.
In the animated shorts category, it would be hard not to want to see your favorite hometown L.A. Laker, Kobe Bryant, give an acceptance speech for "Dear Basketball". It's a little unfair to the other nominees, since the Academy would want to reward its sports-town hero with a nice little retirement gift as a token of appreciation. To be fair, it is a terrific short film, with John Williams providing the musical score. It should be a slam dunk, but I hope to see the frogs of "Garden Party" take the Oscar.
Saoirse Ronan and Greta Gerwig on the set of LADY BIRD
In the Live Action shorts, the light-hearted Australian comedy gets the edge, as it is the only one we can laugh at. The other shorts are heart-wrenching dramas about racial and social situations involving death and violence, with the exception of "The Silent Child", which gets my vote as runner-up. They are all solid, but after watching them, you need a sense of relief.
AND THE REST OF THEM:
Best Picture: Three Billboards Outside Ebbing, Missouri
Actor in a Leading Role: Gary Oldman, Darkest Hour
Actress in a Leading Role: Frances McDormand, Three Billboards Outside Ebbing, Missouri
Actor in a Supporting Role: Sam Rockwell, Three Billboards Outside Ebbing, Missouri
Actress in a Supporting Role: Allison Janney, I, Tonya
Directing: The Shape of Water – Guillermo del Toro
Adapted Screenplay: Call Me by Your Name – James Ivory
Original Screenplay: Lady Bird – Greta Gerwig
Cinematography: Blade Runner 2049 – Roger A. Deakins
Costume Design: Phantom Thread – Mark Bridges
Sound Mixing: Baby Driver – Paul Machliss and Jonathan Amos
Film Editing: Dunkirk – Lee Smith
Sound Editing: Dunkirk – Richard King and Alex Gibson
Visual Effects: War for the Planet of the Apes – Joe Letteri, Daniel Barrett, Dan Lemmon and Joel Whist
Makeup and Hairstyling: Darkest Hour – Kazuhiro Tsuji, David Malinowski and Lucy Sibbick
Original Song: Coco – "Remember Me"
Original Score: The Shape of Water – Alexandre Desplat
The impressive entrance to the LG OLED Canyon at CES2018 in Las Vegas. Photo by Marcus Siu.
Article and photos by Marcus Siu
Walking through breathtaking ice glaciers and magnificent waterfalls, surrounded by natural sounds of blustery winds, thunder, and rain, was quite a spectacular experience. I was not in Antarctica, but at CES 2018 in Las Vegas, the largest consumer electronics show in the world. At the LG booth in Central Hall, convention-goers were led into the main entrance of the LG OLED Canyon, a 92-foot-long installation of 246 open-frame OLED displays in both convex and concave configurations, featuring Dolby Atmos to make it even more sonically immersive. The demonstration had me winding through and circling back, wanting more.
Even a month later, I’m still not quite over it.
The OLED Canyon was not exclusively a demonstration of LG's products; it can also be thought of as a legitimate, customized work of art. The way it was aesthetically sculpted and designed, including its state-of-the-art audio-visuals featuring Dolby Vision, made it a memorable art installation that anyone could enjoy and appreciate in a museum. LG made its point: TV displays can serve another purpose besides showing your favorite TV shows and movies; they can showcase art.
Now, instead of leaving blank screens on our TVs, we can change them to suit our lifestyle and give them life, so they blend into our living space. This was the idea shared by many of the companies displaying their latest UHD TVs.
The LG OLED Wallpaper TVs are meant to blend into your living space, reflecting your own personality and surroundings. Photo by Marcus Siu.
Walking the floors at this year's CES, it became evident that the three manufacturers with the highest global TV market share in 2016 were also out selling the idea that TVs should showcase one's individual lifestyle and taste in art. Samsung, LG, and TCL all feature some type of slideshow that displays various types of art on their 4K UHD TVs.
For example, LG introduced the Gallery OLED TV in 2014; its 55″ display was bordered by an aesthetically pleasing frame, delivering a refined, artistic appearance. In Gallery Mode, users could view high-resolution digital images of paintings by legendary artists such as Vincent van Gogh and Paul Gauguin, transforming the space into an art gallery.
Samsung's "Art Mode" puts its 4K UHD displays in a wooden frame with a white matte around the artwork. You cannot tell it is a display unless you see a cord hanging from it. Samsung's "Lifestyle" TV looks like artwork you would find in a gallery until you turn the power on.
Even Chinese manufacturer Skyworth's W8 TV has an exclusive art screensaver, turning the TV into a painting or a mural in a fusion of art and design, so users can enjoy their own taste in art and blend it into their living rooms.
Chinese set-top box manufacturer Skyworth follows the idea of LG’s design. Photo by Marcus Siu.
Think of it as a slideshow screensaver, like the ones on your computer, but with ultra 4K resolution and the brilliant colors of OLED or QLED that make you wonder whether you really do have the actual painting on your wall, though I assure you the insurance wouldn't be nearly as high as if you really did.
With the latest technology, TVs are getting lighter and thinner, and manufacturers are starting to produce huge "wall" screens that weigh no more than a framed painting, so the consumer no longer needs a TV mount attached to the wall. The display just goes up on the wall like a picture frame; a couple of nails would probably do the trick.
TV will disappear as we know it. Just call them smart connected picture frames. Photography by Marcus Siu.
To summarize, today's 4K UHD TVs are pretty much picture frames that you hang on your wall. The TV as we know it will probably need to be renamed in a few years, to something like a "smart connected picture frame". Picture that…
Imagine talking pictures on your wall… Just when you thought you saw it all with last year's LG OLED-W series, the company continues to impress with something even more groundbreaking: OLED displays that can be rolled up, just like Christmas wrap or a canvas, with quality equal to their standard 55″ OLED models. Unfortunately, it wasn't quite ready for prime time on the CES floor this year, but it definitely sounds like another game changer.
Just imagine rolling up your 4K UHD TV display as though you were rolling the canvas of a Matisse masterpiece out of its frame at the Louvre in Paris.
I.P. Park, CTO of LG addresses the crowd at the LG Press Conference at this year’s CES. Photo by Marcus Siu.
Article and photos by Marcus Siu
LAS VEGAS, Jan. 7, 2018 – I.P. Park, CTO of LG Electronics, opened the early-morning CES press conference in Las Vegas and announced LG’s global AI brand, “LG ThinQ”. It represents LG’s commitment to think about “you”, the user, in everything the company does.
It is also part of the equation for a smart home and a smart, connected lifestyle: with the help of Google Assistant, LG’s products aim to bring innovation for a better life to the consumer.
According to Park, the three main characteristics of LG ThinQ are its ability to evolve over time, the integration of AI into everyday products, and a completely open platform so everyone benefits.
“With ThinQ, the term ‘live and learn’ takes on an exciting new meaning… the more you use our products, the better they evolve to meet your specific needs. The second characteristic of ThinQ is the ability to integrate AI into a diverse portfolio of everyday products. LG makes many products that you use every single day in the home, on the road, and in the office. The third characteristic of LG ThinQ is openness. So our overall strategy is always to bring the most powerful AI solutions to users by utilizing an open platform, open partnership, and open connectivity.”
Google Assistant no longer applies just to LG cell phones, but to a variety of LG products. With its help, you can check the status of your LG home appliances, or tell your LG OLED TV to control your lighting, air conditioning, or air purifier. The possibilities are endless.
David VanderWaal, VP of Marketing, LG Electronics USA, demonstrated how LG AI is making people’s lives easier and better, both inside and outside the home. His first LG AI demonstration was laundry, an everyday task people face.
Everyday routines, such as laundry, are now integrated with LG’s ThinQ AI. Photo by Marcus Siu.
“It’s Saturday morning, I get up, I first like to make sure that everything is ready for the day… so, LG ThinQ has now chosen the very best wash cycle for me, based on my personal laundry habits. LG washers use frequently used laundry settings and repetitive patterns in everyday life to optimize clothing-care methods. For instance, things like changes in the weather and air quality can all be incorporated to optimize the washer settings for the clothes. These are things the consumer often wasn’t thinking about. Our new LG Styler identifies NFC tags embedded inside the clothing and then automatically chooses the optimal setting for clothing care. You don’t have to painstakingly examine the tiny tag on the clothing. Instead, smart technology can effortlessly read the NFC tag and refresh the clothes with the power of steam.”
VanderWaal then moved on to the connected kitchen area with its connected AI appliances on the stage.
David VanderWaal, VP of Marketing demonstrates the InstaView ThinQ refrigerator at CES 2018. Photo by Marcus Siu.
“LG smart connected refrigerators, ranges and dishwashers make managing the kitchen easier than ever. The LG InstaView ThinQ refrigerator is the latest when it comes to smart AI in the kitchen, with a 29” full HD touch screen display that helps you look up recipes, check weather, and even do grocery shopping.”
With a double tap on the touch screen display on the refrigerator door, you can see all the items in the fridge through the transparent panel; the fridge also takes inventory through tagging, like the washer. You can magically check what you need before going to the grocery store without even opening the door. LG not only thinks inside the box, but knows how to tap outside the box to get the information it needs.
VanderWaal then moved toward the living room part of the stage.
“You may wonder why a television needs AI when we already have so many different intelligent speakers. Well, there are a few key advantages. The first advantage of having AI on the television is the ability to control the television with just your voice. The second advantage is that you can now enjoy television with intelligent content information… you can even turn off the TV at the end of the program without having to schedule a particular time… and last but not least, you can get the most out of LG AI TVs with the Google Assistant integrated right into the product.”
AI on the new OLED TVs is even more informative than ever before. Photo by Marcus Siu.
“Show me my photos from my vacation,” commands David VanderWaal, VP of Marketing, at the LG press conference at CES 2018. Photo by Marcus Siu.
VanderWaal demonstrated how Google Assistant can display his photo album on the TV with the command “Show me my photos from my vacation”, and also explained that you can order pizza straight from your television and even book your ride through the TV.
All of LG’s smart TVs are equipped with LG ThinQ AI technology.
David VanderWaal at the CES 2018 press conference on the connected home, showing Google Assistant turning the air purifier on. Photo by Marcus Siu
Tim Alessi, Senior Director of Home Entertainment Product Marketing, LG Electronics USA, explained that there was more to announce than the major AI enhancements, and showed off the new proprietary Alpha 9 intelligent processor that will be featured in this year’s OLED TVs. It is an upgrade from the Alpha 7 intelligent processor used in last year’s models.
Alessi compared the 2017 OLED with this year’s model: even though last year’s was great, this one is even greater. Powered by LG’s Alpha 9, the most powerful processor to date, it delivers up to 50% more image and data processing than last year’s model, for better control of smart functions and advanced image processing such as more accurate color control. The bottom line is that viewers will be able to enjoy images with more astonishing color, clarity, and depth.
“First, by minimizing picture noise, the Alpha 9 uses a four-step process to reduce noise artifacts. Next comes depth and sharpness. Now, most TVs use edge-based depth enhancement, but LG TVs use an object-based enhancer which can precisely separate main on-screen objects from the background.”
Finally, we come to color…
The expanded color look-up table offers more natural color, with over seven times the reference color coordinates of 2017’s processor. Photo by Marcus Siu
“Viewers can enjoy more accurate, lifelike color thanks to the expanded color look-up table, which offers more natural color, expanding the reference color coordinates by over seven times compared to 2017’s processor. Similar to the 2018 OLED TV lineup, all of the Super UHD sets will offer a new processor for enhanced image processing. These TVs will all include 4K Cinema HDR, Dolby Atmos sound, and the gallery mode.”
After last year’s innovation-award-winning LG OLED TV came out, I never thought it could get better. I was wrong.
VanderWaal unveils a new line of LG robots: a serving robot, a porter robot, and a shopping cart robot at CES 2018 in Las Vegas. Photo by Marcus Siu
Last but not least, VanderWaal returned to the stage and unveiled LG’s CLOi robot line: a serving robot and a porter robot, which will handle various hospitality needs, and a shopping cart robot, which can provide automatic payment and grocery service at supermarkets.
“We’re dedicated at LG to developing new robotic technologies to meet and even exceed consumer needs…and we’re going to keep advancing our robotics hardware along with our AI technologies to propel LG to the forefront of the global market.”
Whether in our living rooms, our kitchens, our home theaters, or our cars, LG Electronics’ innovation, along with Google and Amazon, means a better life for all.
The LIFX mini bulb has an output of 800 lumens, is rated at 9 watts, displays 16 million colors, and dims from 1% to 100%. Photo by Marcus Siu
Article and photos by Marcus Siu
When I was at a recent getgeeked press show in San Francisco last month, I came across the LIFX booth, which had a gorgeous light display showing a few of the company’s latest products, including the LIFX mini. Since it was an evening event, the display demonstrated how lighting can create moods and atmosphere and be a critical component in your home.
Creating a mood by just dimming a regular light bulb with a dimmer switch doesn’t cut it anymore once you see how dramatic your lighting can be anywhere in your home. I had toyed with the idea of buying one a few years ago, but never thought it was necessary.
Boy, was I wrong.
After years of frustration with the 3-way incandescent bulb in my bedroom lamp, whose elements would burn out within a few months until it eventually became a “1-way bulb”, I decided it was time to upgrade my lighting. I swapped my old drab GE bulb for a LIFX mini smart bulb in my 15-year-old lamp.
Less than five minutes after downloading the LIFX app, I was in awe of the many modes and options the bulb offers through the very intuitive app on my smartphone.
Not only could I turn the bulb on and off with the touch of a button on my smartphone, which was a pretty big thrill at the time, but I could also choose from 16 million colors. No longer will I need to buy multicolored lights for holiday parties, which is another big plus.
The app has a “themes” function, which creates various moods. It covers the holidays, with “Santa”, “Holly”, “Hanukkah”, even “Halloween” and “Independence Day” themes. For non-holidays, depending on your mood, you can choose from “Mellow”, “Intense”, “Peaceful”, “Relaxing”, “Energizing”, “Soothing”, and more.
The app also has an “effects” function. My favorite is “Flicker”, which looks as though a candle were burning in the room and can definitely set the mood for a romantic evening. With the “Spooky” setting, you can scare the bejesus out of little kids on Halloween night with outdoor lighting. Or you might want the “Strobe” setting for a DJ-nightclub “trance” atmosphere, which even lets you adjust the tempo in milliseconds.
You can even set it to “Music Visualizer” and match to the music you are listening to. I have yet to put on Pink Floyd’s “The Dark Side of the Moon” or the “Wizard of Oz” soundtrack, but will definitely try it soon.
Other effect options are “Pastels”, “Color Cycle”, or “Random”.
Along with your smartphone, you can use the bulb with Amazon Alexa, Nest, and Google Home, which makes it super easy to switch the mood of all your lights in seconds.
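For tinkerers, the same controls the app exposes are also available through LIFX’s public cloud HTTP API, so you can script your lights with a personal access token. The sketch below only builds the request pieces (it doesn’t send anything); the endpoint and header shapes follow LIFX’s documented API, but the specific selector and values are illustrative:

```python
import json

API_BASE = "https://api.lifx.com/v1"  # LIFX cloud API base URL

def build_state_request(token, selector="all", color="blue",
                        brightness=0.35, duration=2.0):
    """Build the pieces of a PUT /v1/lights/{selector}/state request.

    Returns (url, headers, body) so the caller can send it with any HTTP
    client, e.g. requests.put(url, headers=headers, data=body).
    """
    url = f"{API_BASE}/lights/{selector}/state"
    headers = {
        "Authorization": f"Bearer {token}",  # token from the LIFX cloud account
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "power": "on",
        "color": color,            # named color or other LIFX color strings
        "brightness": brightness,  # 0.0-1.0, i.e. the app's 1%-100% dimming
        "duration": duration,      # seconds to fade to the new state
    })
    return url, headers, body

if __name__ == "__main__":
    # "label:Bedroom" is a hypothetical selector for a bulb named "Bedroom"
    url, headers, body = build_state_request("YOUR_TOKEN", selector="label:Bedroom")
    print(url)
    print(body)
```

Returning the request pieces instead of sending them keeps the sketch testable offline and lets you swap in whatever HTTP client you prefer.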
What is great is that you can add and control as many lights in as many rooms as you want, all at your fingertips through the user-friendly app, which makes me want to get more of them. It is also nice to know I will no longer need to crawl out from under the covers just to physically flip the lamp switch.
“Alexa, turn off the lights.”
The LIFX mini lists at $44.99 and is available at retailers, such as Best Buy and Amazon.
In addition to the LIFX mini, the LIFX Tile Kit (left) is the company’s latest product; the tiles can be mounted on the walls of any room. The first batch has sold out and won’t be available again until February 2018. Photo by Marcus Siu.
“It’s Hugo and Jane…we were just blessed to work with these incredible artists…A lot of people lived extraordinary lives…few people can talk about them, and even fewer can have had every great moment filmed by one of the greatest cinematographers of all time”
– Brett Morgen, director
“People said my fame was due to my legs. It was so stupid —it didn’t bother me. It was really very useful because by this time I was needing to raise money myself, so I made use of it.” – Jane Goodall Photo courtesy of National Geographic.
Article by Marcus Siu
In 1960, Louis Leakey, the famed Kenyan paleoanthropologist who suggested that studying living great apes could shed light on the behavior of early hominids, including humans, was searching for a chimpanzee researcher. Instead of assigning a man with formal training in primatology, in a field then dominated by men, Leakey chose a 26-year-old woman named Jane Goodall, who had no college degree or formal scientific training, which was exactly what he was looking for.
Though Goodall may have seemed “unqualified” in the field, she had been passionate about animals since the age of ten and had always dreamed of living in Africa. It probably didn’t hurt her chances that she was an attractive blonde with very nice legs and was Leakey’s personal secretary at the time. Not to mention, Mrs. Leakey probably wanted her out of the office anyway.
On her initial assignment, Goodall was sent to Gombe Stream National Park, a remote area of northwestern Tanzania, to document and study primate behavior. There she learned the patience to sit and wait in the field for weeks before seeing any regular chimpanzee activity. Through her perseverance, she was finally able to blend in and gain the trust and acceptance of the chimpanzees as one of their own. Once this was established, a cameraman from National Geographic was sent in to help document the progress.
As Goodall observed the chimps, they no longer minded her presence and mostly ignored her, as though she were one of them. This gave her an “in” to observe the similarities between chimps and humans, and she slowly discovered that they are highly intelligent, social creatures, much like us. To watch the behavior of these chimps documented on film from Goodall’s point of view for the first time is a revelation, almost as jaw-dropping as watching the first man walk on the moon.
This documentary film is not just about Goodall’s early explorations and groundbreaking field work, but also about her intimate relationship with the chimpanzees, as well as the unanticipated love story with cameraman Hugo van Lawick, a perfectionist in his craft, who would later become celebrated as one of the greatest wildlife photographers and cinematographers of all time.
Photo courtesy of National Geographic.
“When Hugo was making that film, it was not like making a film today”, Goodall reflects, “…it was the old Bolex camera, and it was celluloid, and you had to put a black bag over your head, and you had to thread through all these sprockets and through the gate and close the gate, make sure there wasn’t a hair in the gate…it was difficult.”
Van Lawick shot as a one-man crew, and the equipment (tripods, camera, three cases for the lenses) was far heavier than today’s; he was carrying a ton of gear through the jungles of Africa. “There were only four or five stops of latitude back then, so getting an exposure on a chimpanzee in a dark forest was a near-impossible task. And yet it was nearly impossible to find a single frame that was overexposed, underexposed or out of focus,” Morgen says.
As for the film editing, it was an arduous task for Morgen to shape 140 hours of unseen, perfectly preserved, silent 16mm archival footage into something cohesive. His film editor divided the stock and identified over 140 chimps, even though the film focuses primarily on four of them.
Morgen originally conceived the film to be a “cinematic opera” focusing on the music by Philip Glass without a narrator. However, that idea was thrown out after interviewing Goodall and realizing how intimate her stories were. It became mandatory to put Goodall’s story into the documentary.
The completed film is shaped like an epic narrative romance, as you watch Hugo and Jane fall in love on film. Yet as it turns out, after seeing the final cut you realize it is not so much a love story between Goodall and van Lawick as a love story between each of them and their respective professions.
“Love is between a woman and her work and a man and his work,” Morgen states. “Most people have this romantic idea that the most important relationship in life is with your partner, your lover, your spouse, but for a lot of driven people, their primary relationship is with their work. I started to see that Jane and Hugo’s ultimate breakup was not a tragedy because they both pursued their passions.”
Photo courtesy of National Geographic.
After living among chimpanzees and observing them with their newborns, there is no denying that Jane Goodall became a much better parent to her own child. Photo courtesy of National Geographic.
“With our filter of filmmaking today… there was an opportunity for something immersive, to allow the audience to be on the journey with Jane,” Morgen reflects. “Advances in color grading and sound design that weren’t available 50 years ago helped us bring that experience.” In addition, multi-channel jungle sounds recorded in Gombe were matched to the archival footage. It took about 2½ years to finish the sound editing, along with mixing in Philip Glass’s brilliant and inspirational original score.
Though a myriad of films have been made over the last few decades about the world’s foremost expert on chimpanzees, this may be the definitive, intimate, and immersive document of Jane’s legacy brought to life.
At Pepcom’s latest MobileFocus event in San Francisco, Kingston was on hand showing off its latest product, the Kingston Bolt. This is truly a lifesaver for those whose iPhones or iPads are near capacity from taking lots of photos and videos.
One of the most frustrating things, especially if you have a 16GB iPhone, is that your device has only a limited amount of storage and can run out of room fairly quickly. Depending on how many apps you have and how often you take photos and videos, you may find yourself forced to delete much of the media from the phone itself.
How frustrating is it when the only way to free up space for new photos and videos is to delete your old ones, without being able to transfer them to a hard drive or other storage first?
The Kingston Bolt has solved this problem. Think of it as extra storage or extension for your iPhone.
DT Bolt Duo
If your iPhone or iPad runs out of storage, you can attach this flash device and continue taking photos and videos, or you can simply transfer your iPhone’s media onto the drive to free up capacity. Whichever you prefer; the choice is yours.
The Kingston Bolt attaches via your phone’s Lightning port and, through its app, lets you transfer photos and videos from your phone onto the drive. From there, you can keep the photos as a backup on the Bolt and/or load them onto your computer.
In addition, the app lets you immediately free up your phone by deleting photos and videos once the data is transferred to the Bolt. What’s great is that you don’t even need to launch the app yourself: once the Bolt is plugged in, the app opens automatically and shows the capacity of both your device and the Bolt. The menu offers three options: transfer photos to the Bolt, capture photos directly onto the Bolt, and view the photos and videos on the Bolt.
The transfer function allows you to choose all photos and videos, all photos, all videos, your favorite photos and/or videos, or select photos and/or videos. It shows you how much storage is in each of the options.
Once selected, it allows you to keep or delete the original items from your device.
Personally, I wouldn’t delete anything off my iPhone until I know I safely transferred everything from the Bolt to my computer, just to be safe.
Once that’s done, I would run the transfer another time to ensure everything copied correctly, and only then delete the media from the phone or tablet.
It comes in a protective rubber case attached to a key ring, so it can go on your key chain or whatever else you prefer to attach it to.
It comes in three sizes: 32GB ($59.99 MSRP), 64GB ($89.99 MSRP), and 128GB ($119.99 MSRP).
DT Bolt Duo Packaging
$59.99 list price
Here is their press release
2017 Flash Press Release
Kingston Digital Releases Lightning USB for Apple iPhone, iPad
Fountain Valley, CA – August 30, 2017 –Kingston Digital, Inc., the Flash memory affiliate of Kingston Technology Company, Inc., the independent world leader in memory products, announced today its first dual interface USB with Lightning® connector, DataTraveler® Bolt™ Duo for Apple®1 iPhone®1 and iPad®1. It is designed specifically to free up space on Apple devices with limited memory.
With no option for expandable storage on iPhones and iPads, Bolt is perfect to free up device space. Its intuitive app is simple to use and the capacities allow for storing up to 8,000 photos (32GB), 16,000 photos (64GB) and 32,000 photos (128GB). Photos and videos can be taken directly on Bolt and its slim form factor, protective rubber case and keyring makes it conveniently portable.
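Kingston’s quoted photo counts imply an assumed average photo size of roughly 4 MB per shot, which is plausible for a 12-megapixel iPhone JPEG. A quick sketch of the arithmetic (the 4 MB figure is my inference, not Kingston’s published number):

```python
# Rough sanity check of the quoted photo capacities, assuming an
# average photo size of about 4 MB (an assumption, not Kingston's figure).
AVG_PHOTO_MB = 4

for capacity_gb, quoted_photos in [(32, 8_000), (64, 16_000), (128, 32_000)]:
    # Using the decimal convention 1 GB = 1000 MB, as storage vendors do.
    estimated = capacity_gb * 1000 // AVG_PHOTO_MB
    print(f"{capacity_gb}GB: quoted {quoted_photos:,} photos, "
          f"estimated {estimated:,}")
```

All three quoted figures line up exactly with a flat 4 MB per photo, so real-world capacity will vary with your camera settings and video mix.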
“Users capture more memories every day so it’s only a matter of time before their iPhone fills up. Bolt is a great device to back up and free up storage space on iPhones or iPads.” said Andrew Ewing, Senior Manager Consumer Business, Kingston and HyperX. “It functions like a Flash drive for iPhone. Users simply plug it in, download the Bolt app2 and then back up to their camera roll. No more deciding which photos or videos to keep and which to delete when the dreaded “storage full” notification pops up.”
DataTraveler Bolt Duo is available in 32GB, 64GB, and 128GB capacities and is backed by a two-year warranty, free live technical support and legendary Kingston reliability. For more information visit www.kingstongo.com/bolt.
DataTraveler Bolt Duo Features and Specifications:
More pictures: Never miss a moment, snap more pictures with Bolt.
Easy to use app2: Too many complex apps out there. Bolt app is super easy to use.
Free up space: Transfer your memories and make space for new ones.
No need to wait: In a hurry? Shoot new pictures/videos and save directly to Bolt.
Portable: Easy to take with you using the included accessory.
Simple back up: Back up all those priceless photos, especially those selfies!
Made for iPhone®: Designed to be used with iPhones & iPads, lays flat when plugged in.
Multiple devices: Have more than one iPhone or work on both an iPad and iPhone? No worries, Bolt has you covered.
Capacities: 32GB, 64GB, 128GB
Warranty: 2-year warranty with free technical support
DataTraveler Bolt Duo
Part Number: Capacity
C-USB3L-SR32G-EN: 32GB DataTraveler Bolt Duo
C-USB3L-SR64G-EN: 64GB DataTraveler Bolt Duo
C-USB3L-SR128-EN: 128GB DataTraveler Bolt Duo
1Apple, iPad, iPhone, and Lightning are trademarks of Apple Inc., registered in the U.S. and other countries. 2Works with iOS 9.0+
SAN JOSE, CA, OCTOBER 11, 2017 – At its fourth developer conference in San Jose, Oculus announced its newest and most affordable VR headset, the Oculus Go. The standalone device will ship in early 2018 starting at $199.
CEO Mark Zuckerberg made the announcement to the crowd of 2,900, along with new hardware, software, and content that will bring even more people into VR and expand the ways we work, play, and, of course, connect.
Zuckerberg hopes this will bring VR to the masses. Hopefully, a path to a billion people.
“If we’re going to get a billion people in virtual reality”, Zuckerberg remarked, “we have to keep working on both affordability and quality, but we also have to find the sweet spot in the middle that high quality affordable experience that doesn’t tether you to a PC.”
This certainly keeps the price of experiencing VR to a minimum; the Oculus Rift, by contrast, requires a high-end computer, which can easily run a few thousand dollars depending on how much power is needed.
Nor will you need to snap an expensive phone into a Samsung Gear VR, which doesn’t even have positional tracking anyway.
Oculus also showcased updates to its Santa Cruz prototype, including new positionally tracked controllers with six degrees of freedom that bring the power of Rift and Touch to the standalone category.
The company offered a sneak peek into Oculus Venues, a new experience coming next year that lets people watch live concerts, sports, and movie premieres with thousands of other people around the world.