
Everything posted by maher

  1. Hello Thomas, I can't believe that EIAS 9 is finally out. I just stumbled on that unexpected info almost by accident. I'm a registered owner of EIAS 8 and I didn't receive any email about the 9 release. When I try to click on the «buy» button on your HomePage, nothing happens... Maher
  2. Nah! Doesn't work. I have a mustache and everyone still looks at me like I'm some weirdo, which my wife and kids can candidly confirm... Maher
  3. So Richard, this is what you look like without your Incredible mask... Now we all know your real identity ;-) Maher
  4. The free Mac beta of Rhino is quite functional too and, if you're a student/teacher, you can get a full PC license for $200 with no restriction on professional use. Maher
  5. Sorry Phungus, but that link doesn't work for me :-( Maher
  6. Well Richard, you are forgetting one amazing feature from Mountain Lion! It is the integrated dictation à la Siri. I cannot live without it. Maher
  7. What are the advantages of Vimeo over Youtube? Maher
  8. Hey Thomas, How about some renders to show us how awesome EIAS 9 will be? Maher
  9. OK, this is probably a very dumb question, but it has bothered me for a long time: I thought that most modern render engines take into account only visible polygons. I understand that when doing GI, invisible objects and polygons in a scene can contribute indirectly to the lighting of visible polygons. I understand also that in a multi-million-polygon scene, you selectively and manually remove objects that don't matter much when they are out of the camera field, for faster rendering times. My question is, why doesn't someone code the rendering engine so this happens automatically, in a first-pass calculation, removing polygons that are out of the field beyond some defined distance or angle? Why not automatically reduce those objects by decimating their polygon counts in the same way, when it is important to take them into account for secondary GI influence? Maher
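The automatic culling and decimation described above can be sketched in a few lines. This is a toy Python illustration, not EIAS code or any real renderer's algorithm; the function name, the bounding-sphere frustum test, and the crude "halve the polygons per distance step" decimation rule are all assumptions made up for illustration:

```python
import math

def cull_and_lod(objects, cam_pos, cam_dir, fov_deg, lod_dist):
    """For each object (center, bounding radius, poly count), decide
    whether it is in the camera cone and, if not, pick a decimated
    polygon count so it can still contribute to GI bounces."""
    half_fov = math.radians(fov_deg) / 2.0
    kept = []
    for center, radius, polys in objects:
        # Vector and distance from camera to object center.
        v = tuple(c - p for c, p in zip(center, cam_pos))
        dist = math.sqrt(sum(x * x for x in v))
        if dist < 1e-9:
            kept.append((center, polys))  # camera inside the object
            continue
        # Angle between the view direction and the object direction.
        cos_a = sum(a * b for a, b in zip(v, cam_dir)) / dist
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        # Widen the test by the angular size of the bounding sphere.
        margin = math.atan2(radius, dist)
        if angle - margin > half_fov:
            # Out of frame: keep a decimated proxy for indirect light,
            # halving the polygon count once per lod_dist of distance.
            factor = 2 ** (dist // lod_dist)
            kept.append((center, max(1, int(polys / factor))))
        else:
            kept.append((center, polys))  # visible: keep full detail
    return kept
```

With a camera at the origin looking down +Z and a 60° field of view, an object in front keeps its full polygon count, while an identical object behind the camera gets its count cut down but is not discarded, so a GI pass can still pick up its bounce contribution.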
  10. Wow! Not only did you do it in just a few days, just to prove it's feasible in EIAS, but you've done it with a 4 x 4 Rubik's Cube... I'm very impressed, Incredible Rick ;-)
  11. Alas, they just announced that it was a mistake and they'll patch future versions :-( Maher
  13. Diego, why do you animate the stereo separation? In real life our eyes always stay at the same distance from each other; they just converge as needed on a new point of interest. I always use convergence by reference. You can attach the reference point of the camera to a null object that you keep at approximately the same distance as the main point of interest of your view. If I use convergence at infinity, I end up with movies like the ones produced by the GoPro Hero 3D kit. The software they provide allows you to keyframe the convergence zone through the movie. We could always use such a feature with CG stereoscopic renders, but we need to frame a bit wider and render at a somewhat bigger resolution in order to compensate for the zooming needed to overcome the border offset caused by the change of convergence point.
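The geometry behind convergence by reference is simple toe-in trigonometry. Here is a hypothetical Python sketch; the function name and units are my own for illustration and do not come from any GoPro or EIAS API:

```python
import math

def convergence_angle_deg(eye_separation, target_distance):
    """Toe-in angle in degrees for each camera so that both optical
    axes cross at the point of interest. The separation never changes;
    only the convergence does, just like real eyes."""
    return math.degrees(math.atan2(eye_separation / 2.0, target_distance))
```

With a typical 6.5 cm interocular separation and a subject 1 m away, each camera toes in by a little under 2 degrees; as the target recedes toward infinity, the angle falls to zero and the axes become parallel, which is exactly the "convergence at infinity" case described above.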
  14. Hi Diego, you nailed it! My camera was still in stereo mode... Yes, I can see side-by-side stereo by crossing my eyes, but I've always wondered what everyone would look like if we had to do stereo iChat with this technique ;-) Thanks again Diego and Thomas! Maher
  15. Yes, exactly the same settings. Weird! Maher
  16. I think it's the latest : OS X 10.6.7
  17. I don't know why, but my render window no longer pops up in Animator at the end of a preview; I must go to the Finder and manually open it from the snapshots folder in the EIAS 8 folder. Does anyone have an idea of what may cause this? Maher
  18. Hello Diego, The active shutter glasses have an edge over polarized glasses in terms of resolution only on current 3D TVs. In theaters, the RealD system uses circular polarized glasses and a single projector (not 2 as in IMAX) with an electronic apparatus that alternates, for each full-resolution frame, the circular polarization of the light coming out of the projector. Samsung is working with RealD to apply such a system to 3D TV sets (active 3D TV at full resolution and passive polarized glasses). Now translate such a system to a computer monitor that produces 3D CG scenes on the fly, with head tracking to recalculate the virtual camera position, and you get a truly immersive system. One of my graduation fellows worked as team manager on such a system at the Montreal flight simulation company CAE. They projected the CG data directly onto the retina, tracking the fovea and computing a high-resolution image of 1024 x 1024 just for that high-detail region. For the peripheral vision, they generated a «low-resolution» image of also 1024 x 1024. The first prototype produced the equivalent of half the resolution human vision can perceive, and they matched it with the 2nd generation. Alas, that incredible project was too expensive even for the military, and it never went into production...
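To give a rough idea of why that fovea-tracking approach is so attractive, here is a back-of-the-envelope Python sketch comparing the two-region pixel budget to rendering the whole field at foveal density. The 180° field and 5° fovea are assumed round numbers for illustration, not CAE's actual specs:

```python
def foveated_budget(inset_res=1024, periphery_res=1024):
    """Pixels rendered per frame with a two-region foveated scheme:
    one high-density inset that follows the fovea plus one coarse
    image covering the whole field of view."""
    return inset_res ** 2 + periphery_res ** 2

def uniform_budget(field_deg=180.0, fovea_deg=5.0, inset_res=1024):
    """Pixels needed to cover the whole field at the inset's density,
    i.e. naive uniform rendering (assumed round-number angles)."""
    pixels_per_deg = inset_res / fovea_deg  # density inside the inset
    return int((field_deg * pixels_per_deg) ** 2)
```

Under these assumptions the foveated scheme renders about 2.1 million pixels per frame, versus roughly 1.36 billion to cover the whole field at foveal density: a savings factor of several hundred, which is why tracking the fovea was worth the engineering effort.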
  19. Hello Diego, why do you find the active shutter glasses better than the polarized ones? All those 3D TV sets with active glasses are staying on the shelves of the electronics stores because people don't want to bother with heavy, expensive glasses and recharging units that rapidly decay. The 3D TV sets with passive glasses will probably supplant the first 3D TV generation. For now they have to cut the vertical resolution in half, but if Samsung can really produce a hybrid solution (active 3D TV at full resolution and passive polarized glasses) at a decent price, I think it will be a winner.
  20. Hello all, I'm not a big fan of 3D TV sets and stereoscopic movies, but I always wanted to do stereoscopic CG animations just for the fun of it. A stereoscopic CG scene is just one step closer to a perfect simulation of a real, or a totally unreal, universe if you are a sci-fi fan like me. Since prices are going down for 3D TV sets, due to lack of public interest, or the lack of pertinent content for that matter, I'm considering the cheapest way to produce 3D Blu-ray tests from my EIAS files. The cost of professional 3D mastering software was, until recently, out of reach for a hobbyist. But things may change soon: the company that produces those cheap HD action cameras, GoPro, made a kit where you put 2 of those tiny cams in a waterproof case and sync them together to produce parallax movies, which you mux together with their free software called GoPro CineForm Studio (in fact, GoPro bought CineForm to make a minimal version of their pro app). GoPro CineForm Studio is free, but the catch is that it only muxes files from the GoPro cams. After trying, with mixed results, to hack my EIAS renderings to make them work with the app, I decided to download the trial of the professional version of CineForm (NeoMac). The trial app works without limitations for 10 days and is a lot more complex than GoPro CineForm Studio. The good news is that after contacting the support team at CineForm, I've learned that they're planning to release a version called GoPro CineForm Studio Premium that will work with any source files. I don't know yet the price tag of that piece of software, but it should logically be under the $300 mark of Neo, way under the $1K-and-more price of other solutions. The technology they developed at CineForm is very clever. They multiplex the left/right channels into a QuickTime file's metadata so they are hidden from the QT player or the NLE app. You don't need to import left/right files separately and be careful to keep them in sync while applying effects to both tracks. You just work as if the imported files were 2D. A plug-in adds a permanent menu that lets you choose the way the QT file must be displayed (anaglyph, side by side, interlaced fields, etc.). Now, I have to explore how to burn, on a Mac, a 3D Blu-ray compatible disc that will correctly play on a 3D TV set. Has anyone already found such a solution?
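For anyone wanting to experiment before buying anything, the basic half-width side-by-side packing that most 3D TV sets accept can be mocked up in plain Python. This is a crude sketch with frames represented as lists of pixel rows; it drops every other column instead of filtering, and real tools like CineForm resample properly and wrap the result in QuickTime metadata:

```python
def mux_side_by_side(left, right):
    """Pack a left-eye and a right-eye frame (lists of pixel rows)
    into one half-width side-by-side frame. Each eye is squeezed
    horizontally by discarding every other column (a crude resample;
    a real encoder would filter before decimating)."""
    def squeeze(frame):
        return [row[::2] for row in frame]
    return [l_row + r_row
            for l_row, r_row in zip(squeeze(left), squeeze(right))]
```

The output frame has the same width as either input, with the left eye occupying the left half and the right eye the right half, which is what a 3D TV in side-by-side mode expects.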
  21. Hey, congratulations Richard, Dominique and Philippe! You should teach Dominique to use EIAS 3D next time. 9 months of rendering is way too much ;-) Maher