News Stories

AlloSphere Leads The Way In Integrated Multimedia Systems Research (May 29 open house!)

[Philip Lelyveld comment: the Allosphere is a walk-in 3D video, 3D audio fully immersive interactive environment at UC Santa Barbara]

[The Bottom Line]

“This is kind of like where video gaming meets high-performance computing,” said Dr. JoAnn Kuchera-Morin as she led awestruck onlookers through the AlloSphere, a virtual environment housed in the University of California, Santa Barbara’s Elings Hall.

The AlloSphere, the product of 26 years of Kuchera-Morin’s research and labor, is a spherical, three-story anechoic chamber that uses multiple projectors, speakers and a supercomputer to visually and sonically represent data. The multi-sensory display that characterizes the AlloSphere is derived from complex mathematical algorithms mapped by Kuchera-Morin and teams of researchers. …

The AlloSphere is not merely a demonstration; it offers immersive exploration of highly complex mathematical visualizations and anatomical data that would otherwise be frustrating for scientists to explore. …

The AlloSphere will be featured in the annual Media Arts and Technology End of Year Show “Bits and Pieces” on Tuesday, May 29 from 6 to 9 p.m. Those in attendance will have the opportunity to tour the instrument, which is rarely exhibited because it costs about $3,000 an hour to run.

“We’ll have all of our labs open and all of the Ph.D.’s will be exhibiting their work. We’ll show the AlloSphere as well. We’re anticipating hundreds of people,” Kuchera-Morin said about the annual display.

See the full story here: http://thebottomline.as.ucsb.edu/2012/05/allosphere-leads-the-way-in-integrated-multimedia-systems-research

Flexible Displays Landing in 2012, But Not in Apple Gear

[Wired]

…In early March, Samsung announced it would be mass-producing its flexible OLED displays, like the one seen above, by the end of this year. Now flash-forward to this Monday: According to a report from the Korea Times, Samsung is seeing “huge” orders for this display, and Apple is “likely” to be one of the major players.

Such a display could be useful in a number of applications, such as in a device with a gently curved screen. Ultimately, the display could even be deployed in a flexible, bendable phone or tablet. But that’s probably not on the horizon — especially Apple’s horizon — anytime soon.  …

Samsung’s flexible OLED display certainly has some advantages over current display tech. For one, it’s basically unbreakable because it doesn’t use glass, but rather a type of plastic called polyimide. …

Colegrove said there are two reasons why Samsung’s flexible OLED is attractive to device manufacturers. First, the display is thin, lightweight and difficult to break — this offers immediate design benefits. Second, any type of new, novel technology offers marketing benefits. You can hear the commercial spiel now: “We have the first flexible AMOLED display devices in human history!”  …

And if you’re looking to find a flexible display in an iDevice, you’ll probably have to wait until the 2013-2014 time frame, says Colegrove — with truly bendy iDevices appearing in 2015 at the earliest.

See the full story here: http://www.wired.com/gadgetlab/2012/05/apple-flexible-displays/


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification for best practices naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs from the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
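To make the idea of a frame-sequence naming convention concrete, here is a minimal sketch in Python. The filename pattern shown (descriptor, version, frame number, extension) is a hypothetical illustration of the general approach, not the actual pattern defined in the ETC@USC specification; consult the published document for the real rules.

```python
import re

# Hypothetical pattern for a frame-numbered image sequence name such as
# "show_seq010_sh0100_plate_v002.1001.exr" -- an illustrative convention,
# not the naming rules defined by the ETC@USC specification.
FRAME_NAME = re.compile(
    r"^(?P<descriptor>[A-Za-z0-9_]+)"   # shot/asset descriptor, e.g. show_seq010_sh0100_plate
    r"_v(?P<version>\d{3})"             # zero-padded version number
    r"\.(?P<frame>\d{4,})"              # zero-padded frame number
    r"\.(?P<ext>exr|dpx)$",             # image format, e.g. OpenEXR or DPX
    re.IGNORECASE,
)

def parse_frame_name(filename: str) -> dict:
    """Split one sequence filename into its named fields, or raise ValueError."""
    match = FRAME_NAME.match(filename)
    if not match:
        raise ValueError(f"Filename does not follow the expected pattern: {filename}")
    return match.groupdict()

if __name__ == "__main__":
    print(parse_frame_name("show_seq010_sh0100_plate_v002.1001.exr"))
    # {'descriptor': 'show_seq010_sh0100_plate', 'version': '002', 'frame': '1001', 'ext': 'exr'}
```

The point of any such convention is the same as the specification’s: when every field sits in a predictable position, tools on both sides of a delivery can parse, validate and sort frames automatically instead of relying on per-vendor custom processes.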

To ensure all requirements were represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Horst Sarubin of Universal Pictures, chair of the VFX working group, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
