First Media Appearances

In the past three weeks we have appeared in the local news twice: first in the newspaper, and then on WCSH6, our local news station.

Larry Grard, from the Tri-Town Weekly, interviewed us after school just before Christmas break. He then wrote an article that was published on Wednesday, January 8. The online version of the article can be found here.

After the article ran, WCSH6 contacted us and expressed interest in interviewing us as well. Tim Goff, from WCSH6, came and talked with us on Thursday, January 16. The story aired the next day at both 6:00pm and 11:00pm, as well as on Saturday, January 18 at 10:30. The video and a write-up done by WCSH6 can be found here.

Clip from video


Maya Animations

We decided to make an animation of the submersible in Maya so that people could get an idea of what the end product might look like and do. The first attempt was not very successful, as we were working with parts of Maya that we hadn't used before.

The First Attempt:

We used the one-body submarine because it seems that will be our final choice. We started by creating a sea floor, then added the submarine and a camera (with aim).

Two lights were used: a directional light aimed downward from above the scene, and a spotlight on the front of the submersible. The first was not very bright and only existed so that the sub and the "unlit" parts of the scene could be seen. The main light went on the front of our sub as a spotlight because that is how the submersible will actually see; after a few meters underwater there is very little light, so the submersible will have to provide its own.

As with any photorealistic rendering, we then spent a fair amount of time making the submersible, rocks, and sea floor look natural and lifelike.

After animating the scene (just the sub and camera) a few different times, we decided that it looked quite boring, so we added an arc-shaped rock and a smaller rock on the sea floor.

Next we decided to add seaweed. To do this we used the "squirrel fur" option in the fur menu, then changed it to be long, green, and sparser than actual fur. We also made it scraggly, with a much thinner top than bottom, so it would look more like seaweed.

Finally, we made a fan that affected the fur through a hair system and animated the seaweed so that it waved back and forth like waves washing over it.

Finished First Attempt Unrendered

Then came the hardest part: making a video from the animation. We started by rendering 96 frames as .iff files, but were unable to find a converter. Then we realized that with QuickTime 7 Pro we could convert JPEGs into a .mov.

Once we did that, we decided that we needed to make it longer and fix some of the lighting. We turned it into a 144-frame animation, re-rendered a few times, and tried converting a few times until we finally got it to work.

This is the final animation at 12 fps.

The Second Attempt:

This past Monday (1/12/14) we learned that we were being interviewed by Channel 6 on Thursday. We decided that the more basic animation was not going to be good enough for television, so we set out to make a better one.

Over the next four evenings we spent a total of about 24 man-hours on the new version of the animation.

We first took the new SolidWorks model of the submarine that Jon Amory helped us create. It has the main thruster, as well as a flashlight and Pi camera, on top of the basic shape.

Possible Final Submersible

The first thing we did was locate the underwater scene preset. It was hard to find, and we had to resort to using the computer's find function. It was very helpful, though: it included a light that made anything placed in the scene look as if water were moving over it, and fog that reduced visibility in the distance.

We then created the squirrel-fur seaweed and the rock again, and worked on making the submarine and rocks look realistic.

We then made the light for the front of the submarine and made it part of a group.

Next we did the actual animating of the submarine and the camera (with aim), as well as the fan for the seaweed's motion.

Finally, we spent a fair amount of time getting a geyser that emits bubbles to appear at the very end of the animation, along with a mass of crystals, which were already available in Maya under PaintEffects.

Finished Second Attempt Unrendered

Second Attempt Full Screen

Here is what the final animation looks like.


RasPi Camera: GStreamer-1.0 w/ Windows 7

As discussed in our previous post, the MJPG-Streamer video rate using the Pi's camera module was definitely not acceptable for our project. The maximum speed of raspistill (even with dropped frames) was far below the video quality we need. After researching multiple streaming methods we settled on GStreamer-1.0, an open-source audio and video streaming framework. This method is faster because the software opens a direct network pipeline between the Raspberry Pi and the OCU (in this case a Windows 7 machine).

This tutorial is written for Windows 7, but GStreamer also works on Mac and Linux. The setup on Mac and Linux is actually much easier, as GStreamer was originally created for those platforms.

1a. First, prime the Pi by adding the following line to /etc/apt/sources.list:
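At the time we did this, GStreamer-1.0 packages for Raspbian came from a third-party repository; the exact line below is from our notes, and the URL may have changed since, so treat it as an assumption:

```
# /etc/apt/sources.list -- third-party repository that packaged
# GStreamer-1.0 for Raspbian (URL may be out of date)
deb http://vontaene.de/raspbian-updates/ . main
```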

2a. Then update the Pi to download dependencies for GStreamer:
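The update step is the usual apt refresh, so apt can see the newly added repository:

```
# refresh the package lists (requires sudo and a network connection)
sudo apt-get update
```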

3a. Finally download GStreamer-1.0:
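The actual install command for step 3a was a one-liner; the package name below is from our notes and is an assumption (the repository provided a gstreamer1.0 meta-package):

```
# install GStreamer-1.0 from the repository added in step 1a
sudo apt-get install gstreamer1.0
```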

4a. Once the download is complete, pipe raspivid through GStreamer using the command below.
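The pipeline we ended up with looked roughly like the sketch below: raspivid writes raw H.264 to stdout, and gst-launch-1.0 packs it into RTP and pushes it over UDP (avoiding gdppay, which the Windows build lacks). The resolution, bitrate, and port are our choices, and 192.168.1.5 is a placeholder for the OCU's IP:

```
# stream H.264 from the Pi camera to the OCU over UDP
# -t 0 = run forever, -b = bitrate in bits per second
raspivid -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=192.168.1.5 port=5000
```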

****Make sure to use this specific pipeline, because GStreamer-1.0 for Windows does NOT include gdpdepay or gdppay as plugins.

1b. In Windows, visit http://gstreamer.freedesktop.org/data/pkg/windows/1.2.2/ and download this (make sure you have the correct installer -- Windows 7 can run the .msi installer out of the box, but other versions may need additional setup):

2b. Install!

3b. Using cd and dir, navigate to /gstreamer/1.0/x86_64/bin

4b. Now you can use the gst-launch-1.0.exe command to read the stream coming from the Pi (make sure to use the IP you found using ipconfig above!).
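On the Windows side, the receiving pipeline we used looked roughly like this; the caps string must match what the Pi is sending, and the port and element choices are a sketch from our notes:

```
REM run from the GStreamer bin directory in a command prompt
gst-launch-1.0.exe udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
```

Setting sync=false tells the video sink not to wait on timestamps, which helps keep the latency down.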

As you can see, the quality of the stream is as good as the Pi module can give. Unlike the jerky MJPG-Streamer, the GStreamer pipeline has less than 0.3 seconds of delay! This is perfect for our Pi-based submersible. Unfortunately, there is not yet a Java API built for GStreamer-1.0, so we will be defaulting to the older version (GStreamer-0.10) in order to use this video streaming software.


RasPi Camera: MJPG-Streamer Comparison

After playing around with different USB webcams we decided to test out video hardware/software specifically made for the Raspberry Pi. Although having the ability to attach a USB cam to the Pi opens up many possibilities, our hope is to have a video platform that streams high-definition video with the lowest possible latency. While USB cameras have a relatively nice picture, they have higher latency than we would like. Below is a quick run-through of how we set up the Pi Camera and tested it against a USB camera using MJPG-Streamer.

Pi Camera

We used the Pi Camera Module (found here). A picture of the module is below. A lot of our information on the Pi module came from this blog post:

To attach this hardware, find the black clasp near the Ethernet jack and pull up on its tabs. Take the ribbon cable on the module and slide it into the clasp (making sure that the contacts on the clasp and cable are touching), then press down on the tabs. Done! The module is correctly attached to the Pi. A picture for reference is below:

Next you need to enable the RasPi Camera from the Pi's configuration window. Do this by running:
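Enabling the camera is done through raspi-config:

```
# open the Pi's configuration tool (requires sudo),
# then select the camera option and reboot when prompted
sudo raspi-config
```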

And choosing this option in the configuration window:

After rebooting both update and upgrade the Pi:
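The update/upgrade step is the usual pair of apt commands:

```
# refresh package lists, then upgrade installed packages
sudo apt-get update
sudo apt-get upgrade
```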

To compare the streams we are going to test MJPG-Streamer on the Pi module and a USB webcam at the same time. We will not be testing the Pi's native streaming at this point because of a few disadvantages (e.g., you must decide where the Pi will stream the video to before you start). Use the links on this page to help you set up native streaming with the Raspberry Pi Camera Module. Otherwise, continue reading.

Next you have to install MJPG-Streamer. Documentation for this can be found on THIS post (or this very useful blog). After MJPG-Streamer is compiled, you must make a location to which the RasPi images are written and from which MJPG-Streamer reads them.

First make a directory to which you can save the images:
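Creating the directory is one command (we used /tmp/stream, matching the rest of this post):

```shell
# create the directory that raspistill will write images into
mkdir -p /tmp/stream
```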

Then make it writeable:
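Making it writeable is a chmod (777 is the blunt instrument we used; a narrower mode would also work):

```shell
mkdir -p /tmp/stream   # ensure the directory exists (created in the previous step)
chmod 777 /tmp/stream  # make it writeable by any process
```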

*****You must re-create /tmp/stream and make it writeable after each reboot, or the stream will not work!

You must now run a raspistill image capture at a very short interval. This is used for video because MJPG-Streamer is only compatible with JPEG images, not the h264 video created by raspivid. The command below will have the camera capture an image every 0.1 s (for information on the meaning of these flags, check out this blog post).
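Our capture command looked roughly like the following; the resolution and quality flags are our choices, and -tl 100 is the 0.1 s interval (raspistill overwrites pic.jpg each time):

```
# take a JPEG every 100 ms for a very long time, overwriting pic.jpg
# -th 0:0:0 requests a zero-size thumbnail to save a little time per frame
raspistill --nopreview -w 640 -h 480 -q 10 -o /tmp/stream/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &
```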

Then run MJPG-Streamer. MJPG-Streamer will both read the pic.jpg created by raspistill and serve it on a web server.
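The launch command we used was along these lines (the library and www paths depend on where MJPG-Streamer was compiled; the input/output plugin options and port 9000 match the rest of this post):

```
# read pic.jpg from /tmp/stream and serve it over HTTP on port 9000
LD_LIBRARY_PATH=/usr/local/lib mjpg_streamer \
  -i "input_file.so -f /tmp/stream -n pic.jpg" \
  -o "output_http.so -w /usr/local/www -p 9000"
```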

To log onto the web server, type <http://ip:port> into your web browser; in this case the port is 9000. Here is an image of what you should see:

Comparison: USB and RasPi Module

After setting up the Pi Camera module, we tested it against a USB webcam, both using MJPG-Streamer.

Pi Pros:
  • Small!
  • High quality
Pi Negs:
  • Easily backed up (skips frames because raspistill cannot write any faster than every 0.1 s)

USB Pros:
  • Faster
  • Customizable
USB Negs:
  • Larger camera
  • Not as nice quality for the price


We would recommend using a USB webcam for streaming live video from the Pi with MJPG-Streamer because of its fluidity compared to raspistill. The raspistill feature on the Pi camera is meant for taking still images and does not capture them at a pace fast enough for a robot. We have not given up on the Pi Camera module, though...stay tuned for a tutorial using the camera's native streaming software! Next time we'll compare the native streaming against MJPG-Streamer.