Demo:

 

We have basic end-to-end communication working across the entire project. The different effects in the movie are translated into different light combinations on the LEDs on the board.

 

Right now, the application loads the subtitle file, performs real-time analysis, and then conveys the effects to the board.
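As a rough illustration of this flow (not our exact implementation), the sketch below pulls bracketed effect cues out of a subtitle block, maps each cue to an LED pattern byte, and writes it to the board over a serial link. The effect names, LED codes, serial port, and baud rate are all placeholder assumptions.

```python
# Minimal sketch of the subtitle -> effect -> board flow.
# Assumes pyserial is installed; the port name, baud rate, effect names,
# and LED pattern bytes are placeholders, not the real protocol.
import re
import serial  # pip install pyserial

# Hypothetical mapping from bracketed cue keywords to LED pattern bytes.
EFFECT_TO_LED = {
    "explosion": 0b00000111,
    "thunder":   0b00000101,
    "gunshot":   0b00000010,
}

CUE_PATTERN = re.compile(r"\[([^\]]+)\]")  # text inside [ ... ]

def extract_effects(subtitle_text):
    """Return the known effect keywords found in one subtitle block."""
    effects = []
    for cue in CUE_PATTERN.findall(subtitle_text):
        keyword = cue.strip().lower()
        if keyword in EFFECT_TO_LED:
            effects.append(keyword)
    return effects

def send_effect(port, effect):
    """Send the one-byte LED pattern for an effect to the board."""
    port.write(bytes([EFFECT_TO_LED[effect]]))

if __name__ == "__main__":
    board = serial.Serial("/dev/ttyUSB0", 115200)  # placeholder port
    for effect in extract_effects("[EXPLOSION] Get down!"):
        send_effect(board, effect)
```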

 

Looking forward, we hope to finalize our PCB design and incorporate the motors into the project soon.

Working stages of the project.

 

So far, we have completed the following steps in the project:

 

  • User Interface: An easy-to-use graphical user interface that can:

    • Let the user choose the movie to be played

    • Display a thumbnail of the video

    • Launch a pop-up video player to start playing the movie

  • Text Processing:

    • While downloading or loading the movie, the closed captioning or subtitle file is also loaded into the backend for processing

    • Effects corresponding to the situational cues enclosed in ‘[ ]’ in the closed captions are conveyed to the jacket.

    • TIME: effects have to be executed within a very tight time window (see the timing sketch after this list)

      • Even milliseconds of difference will lead to effects that are out of sync with the video

    • ACCURACY: accurately analyzing and tagging words in the closed captions to output effects that correspond to appropriate actions/emotions

  • Connectivity:

    • Both the wired (USB) connection and wireless communication with the jacket are working successfully (see the transport sketch after this list).

  • Mechanical operation of the jacket components:

    • The jacket has sensors and actuators to accommodate the different kinds of experiences. As of now, we are looking to incorporate:

      • Heat/Cool

      • Vibration: Different groupings of actuators produce the different effects discussed ahead (see the grouping sketch after this list).

      • Reading and analyzing different emotions from the subtitles and initiating the corresponding reaction through the jacket.

    • Powered by Battery

    • Randomizing the location of punch and shot effects.
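
For the TIME requirement above, one way to keep effects aligned with the video is to schedule each cue against a monotonic clock anchored to the moment playback starts. The sketch below is only an outline of that idea; the cue times and the send_effect stub are made up for illustration.

```python
# Sketch of time-aligned effect dispatch, assuming cues are known up front.
# Cue times and the send_effect stub are illustrative only.
import time

def send_effect(name):
    print(f"effect: {name}")  # stand-in for the real serial/wireless send

def run_schedule(cues):
    """cues: list of (start_seconds, effect_name), sorted by start time."""
    t0 = time.perf_counter()  # anchor to playback start
    for start, name in cues:
        # Sleep until just before the cue, then busy-wait the last ~2 ms
        # to keep the dispatch jitter small.
        remaining = start - (time.perf_counter() - t0)
        if remaining > 0.002:
            time.sleep(remaining - 0.002)
        while time.perf_counter() - t0 < start:
            pass
        send_effect(name)

if __name__ == "__main__":
    run_schedule([(1.0, "thunder"), (3.25, "explosion")])
```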

 
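The wired (USB) and wireless links under Connectivity can sit behind one small transport interface so the rest of the code does not care which path is in use. This is a sketch of that idea, assuming pyserial for USB and a plain TCP socket for the wireless side; the port name, address, and single-byte framing are placeholders.

```python
# Sketch of a shared transport for USB-serial and wireless (TCP) links.
# Port name, host/port, and the single-byte framing are assumptions.
import socket
import serial  # pip install pyserial

class UsbTransport:
    def __init__(self, port="/dev/ttyUSB0", baud=115200):  # placeholder port
        self.ser = serial.Serial(port, baud)

    def send(self, payload: bytes):
        self.ser.write(payload)

class WirelessTransport:
    def __init__(self, host="192.168.4.1", port=5000):  # placeholder address
        self.sock = socket.create_connection((host, port))

    def send(self, payload: bytes):
        self.sock.sendall(payload)

def send_effect(transport, led_pattern: int):
    """Send one effect frame regardless of which link is underneath."""
    transport.send(bytes([led_pattern]))
```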

 
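For the vibration groupings and the randomized punch/shot location, one simple representation is a map from each effect to a set of motor indices, with localized effects picking a random zone each time. The motor layout and effect names below are purely illustrative.

```python
# Sketch of effect -> actuator-group mapping with randomized punch/shot zones.
# Motor indices, zone layout, and effect names are illustrative assumptions.
import random

# Hypothetical layout: motors 0-3 on the chest, 4-7 on the back.
EFFECT_GROUPS = {
    "heartbeat": [0, 1],          # small cluster over the chest
    "rumble":    list(range(8)),  # broad chest + back group
}

PUNCH_ZONES = [[0, 1], [2, 3], [4, 5], [6, 7]]  # candidate impact zones

def motors_for_effect(effect):
    """Return the motor indices to drive for a given effect."""
    if effect in ("punch", "shot"):
        return random.choice(PUNCH_ZONES)  # randomize the impact location
    return EFFECT_GROUPS.get(effect, [])

if __name__ == "__main__":
    print(motors_for_effect("punch"))   # e.g. [4, 5]
    print(motors_for_effect("rumble"))
```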

Final demo sample video.
