AVITS - A Virtual Environment Using Simplified Technology
This project consisted of the development of a virtual reality environment for use by the School of Architecture Projects Department. The idea was to build a system that students could use to present and develop architecture and urbanism projects. The system was intended to achieve three main goals:
- be affordable,
- be easily replicated by other groups,
- be easily moved from place to place.
The screen layout was inspired by the Penn State Immersive Environments Lab. The physical structure was divided into three rear-projection modules that could be mounted independently and then combined into a single screen. A mirror was used to fold the projection path, reducing the depth of each module and avoiding the need for more expensive short-throw projectors. The modules were also built on wheels to make alignment and movement around the room easier. Each screen measured 1.70 m x 2.27 m, and the screens were positioned at an angle of 30 degrees to each other. In this way the system could accommodate a small audience, enough for the architecture students' needs.
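To give a rough sense of why the mirror matters, take the 2.27 m dimension as the image width and an assumed throw ratio of about 1.5 for a standard-throw projector (this figure is an assumption for illustration, not a measured value from the project):

$$d_{\text{throw}} \approx R \times W \approx 1.5 \times 2.27\ \text{m} \approx 3.4\ \text{m}$$

Folding that optical path once with a mirror lets the full throw distance fit inside a cabinet much shallower than 3.4 m, without paying for a short-throw lens.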
The entire structure was held together by joints that could be assembled without a screwdriver or any other tool. The projectors were positioned right above each module's screen. Alignment was not easy, since the projector supports were very simple.
The system used active stereoscopy and was driven by a single computer through a Matrox triple-head splitter, so the three projectors were seen by the system as a single high-resolution display. We performed tests with 120 Hz and 85 Hz projectors and found that the PG-F211X projectors were able to sync with the video input signal at 85 Hz, so they could be used for active stereo at low light levels. However, since that model had been discontinued, we could not rely on it and ended up buying the more expensive 3D projectors. We used the NVIDIA 3D glasses.
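For concreteness, assuming XGA (1024 x 768) projectors (an assumption; the resolution of the final 3D projectors is not stated here), the splitter exposes a single wide desktop, and active stereo halves the refresh rate seen by each eye:

$$3 \times 1024 \times 768 \;\Rightarrow\; 3072 \times 768, \qquad \tfrac{85}{2} = 42.5\ \text{Hz per eye} \quad \text{vs.} \quad \tfrac{120}{2} = 60\ \text{Hz per eye}$$

The lower per-eye rate makes flicker more apparent as brightness increases, which is consistent with the 85 Hz setup only being comfortable at low light levels.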
The rendering engine used was Irrlicht.
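As a rough sketch of how a single Irrlicht device could drive the spanned display, the snippet below creates one wide window and loads a model. The resolution, driver choice, and model path are assumptions for illustration, and the stereo pair itself (handled by the NVIDIA driver and glasses) is not shown:

```cpp
#include <irrlicht.h>

using namespace irr;

int main()
{
    // Assumed spanned resolution: three XGA projectors side by side (3 x 1024 x 768),
    // which the Matrox splitter presents to the OS as one wide display.
    IrrlichtDevice *device = createDevice(
        video::EDT_OPENGL,                 // OpenGL driver (assumption)
        core::dimension2d<u32>(3072, 768), // one surface spanning the three screens
        32,                                // colour depth
        true,                              // fullscreen
        false,                             // stencil buffer
        true,                              // vsync, helps keep active stereo stable
        0);
    if (!device)
        return 1;

    video::IVideoDriver  *driver = device->getVideoDriver();
    scene::ISceneManager *smgr   = device->getSceneManager();

    // Hypothetical model file exported by a student; Irrlicht loads several common mesh formats.
    scene::IAnimatedMesh *mesh = smgr->getMesh("project_model.3ds");
    if (mesh)
        smgr->addAnimatedMeshSceneNode(mesh);

    // Simple first-person camera; in AVITS the Wiimote/Nunchuk drove the camera instead.
    smgr->addCameraSceneNodeFPS();

    while (device->run())
    {
        driver->beginScene(true, true, video::SColor(255, 30, 30, 30));
        smgr->drawAll();
        driver->endScene();
    }

    device->drop();
    return 0;
}
```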
For interaction with the virtual environment we decided to use a Wiimote + Nunchuk pair: the Nunchuk's analog stick was used to move around and the Wiimote itself to look around. We ran some tests, and the idea was quickly picked up by students and professors. Since we did not have the original sensor bar, I built another one using six infrared LEDs.
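A minimal sketch of the kind of camera mapping this implies, assuming the Wiimote library already delivers a normalized Nunchuk stick reading and pointer deltas derived from the IR sensor bar; all names and constants below are hypothetical:

```cpp
#include <cmath>

// Hypothetical per-frame camera update: the Nunchuk analog stick translates the viewer,
// the Wiimote pointer (tracked against the IR sensor bar) rotates the view.
struct Camera {
    float x = 0.f, y = 1.7f, z = 0.f;  // position, with an assumed eye height in metres
    float yaw = 0.f, pitch = 0.f;      // orientation in radians
};

void updateCamera(Camera &cam,
                  float stickX, float stickY,       // Nunchuk stick, each in [-1, 1]
                  float pointerDX, float pointerDY, // Wiimote pointer delta since last frame
                  float dt)                         // frame time in seconds
{
    const float moveSpeed = 2.0f;  // metres per second, assumed walking pace
    const float lookSpeed = 1.5f;  // radians per unit of pointer delta, assumed

    // Look around with the Wiimote.
    cam.yaw   += pointerDX * lookSpeed * dt;
    cam.pitch -= pointerDY * lookSpeed * dt;

    // Move in the direction the camera is facing, driven by the analog stick.
    cam.x += (std::sin(cam.yaw) * stickY + std::cos(cam.yaw) * stickX) * moveSpeed * dt;
    cam.z += (std::cos(cam.yaw) * stickY - std::sin(cam.yaw) * stickX) * moveSpeed * dt;
}
```

In the actual system these input values would come from whatever Wiimote library was used (the original text does not name one) and would be fed into the Irrlicht camera node each frame.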