The project is an extension of the original Ecoid visualisation, which used data provided by the DAT Ecoid Project to create a biologically inspired visual representation: each individual Ecoid’s data was displayed as an object resembling a cell, made earlier in the DAT302 module. Like the earlier project, the base idea is to create an interesting visualisation of the Ecoid system using the live data that is available. Each cell’s “biology” will be altered by the different data modules: light, temperature, humidity, stretch, the time the data was taken, its GPS coordinates and which number Ecoid it is. We wish to use the final project to further develop the original visualisation, creating a second version that extends its biologically inspired aspect into a more dynamic, ecosystem-like visualisation.
To develop the ideas behind our project, we intend to build upon a previous project, “The Ecoid Project”, which we created within this module. Our original project used a few of the Ecoids and created a static environment in which to display them.
Our expansion on the initial project will first of all improve on, and fix, the bugs we had first time round. Then we hope to change some of the Ecoid outcomes. One instance: instead of the temperature affecting the ‘shaking’ of an Ecoid, a warm Ecoid will become a hotspot and “attract” the colder Ecoids, as if they were “huddling for warmth”.
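The “huddling for warmth” behaviour could be sketched roughly as a simple attraction step: each update, a cold Ecoid drifts towards a hotspot, with a pull that grows with the temperature gap. This is a minimal Java sketch of the idea only; the class, method names and constants are made up for illustration, not taken from the project, and in the real sketch this would run inside Processing’s draw loop.

```java
// Hypothetical sketch of the "huddling for warmth" attraction; not project code.
public class Huddle {
    // Move (x, y) one step towards the hotspot at (hx, hy). The pull
    // strength grows with the temperature gap between hotspot and Ecoid.
    public static double[] step(double x, double y, double temp,
                                double hx, double hy, double hotTemp) {
        double dx = hx - x, dy = hy - y;
        double dist = Math.sqrt(dx * dx + dy * dy);
        if (dist < 1e-9 || hotTemp <= temp) {
            return new double[] { x, y };  // already warm enough, or at the hotspot
        }
        double pull = 0.05 * (hotTemp - temp);  // arbitrary attraction constant
        double stepLen = Math.min(pull, dist);  // don't overshoot the hotspot
        return new double[] { x + dx / dist * stepLen, y + dy / dist * stepLen };
    }

    public static void main(String[] args) {
        double[] p = { 0, 0 };
        // A 5 °C Ecoid drifting towards a 25 °C hotspot at (10, 0).
        for (int i = 0; i < 100; i++) {
            p = Huddle.step(p[0], p[1], 5.0, 10.0, 0.0, 25.0);
        }
        System.out.printf("final position: (%.2f, %.2f)%n", p[0], p[1]);
    }
}
```

Capping the step length at the remaining distance stops cold Ecoids oscillating around the hotspot once they arrive.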
Another expansion will be to give the Ecoids some kind of AI system, allowing them to interact with one another and to continue “growing” the longer they produce similar results.
We wish to explore the use of live feeds, Processing and data visualisation to ultimately produce something that can display the health of all of the Ecoids in a way that is visually interesting both to people working on the project analysing the data and to strangers to the project.
After discussing which directions we would like the second version of this project to take, we brainstormed ideas and came up with the following notes:
Upon an Ecoid’s readings being consistently close to a similar value (e.g. always hot, never light), the Ecoid will change further and ‘evolve’. Suggestions for added traits:
Plan of Work:
Production Timetable: plan of work and timescale for developing and completing the project.
The foreseen end result of the project is a visualisation that displays the Ecoid data sets in a biologically influenced manner. It should act as a semi-simulative visualisation that acts upon the Ecoid ‘cells’ to create a visual ‘ecosystem’. The visualisation should make people think about the gathered data differently from simple statistics.
- The anticipated outcome of our project is a visualisation that can be used as a tool to analyse the data gathered via RSS feeds from the Ecoids, for the use of developers and interested onlookers.
- It will also act as an educational tool for the schools where some of the Ecoids will be placed, showing the development of, and patterns in, the surrounding ecosystem.
- It can also be useful to people who are interested in eco science and would like to see the information in a way better suited to visual learners, using colours and shapes rather than numbers and symbols.
We will produce a sophisticated visualisation of the data gathered from the Ecoid project that DAT is currently undertaking. It will display the data in a new way, simulating it in its own self-contained ecosystem, which will allow the Ecoids to take on “life” and interact with each other. While at face value this project will appear to be just a visualisation, it will in fact be a simulation of the data as if it had organic characteristics. It is a large improvement on a previous session in which we created a very basic version of this.
We will be using our technical skills to display the information, taking the RSS feeds from the Ecoids that will be/have been deployed around the South West and inputting them into Processing, using several additional libraries to enhance its look, performance and ability. Our conceptual skills will be present in treating the data, as mentioned previously, as a biologically inspired and influenced simulation, designed to give the data more life and hopefully make the information easier to analyse.
The Blind Watchmaker:
Spore Prototypes (Net City, Cell Culture):
The Ecoid Project 1.0:
For our penultimate project, we were asked to visualise data again, but for a specific purpose: to visualise Ecoids.
Now, Ecoids are small devices that collect weather and environment data using XBees from around North Devon, and are part of a DAT project led by Lee Nutbean. We saw a project created by Simon that visualised each Ecoid in a separate visualisation, so we decided to visualise all of them on one screen. Mike Philips seemed to like the idea, as it could potentially be something he could use, rather than the more physical projects other students were doing.
Although we were visualising weather data, the word ‘Ecoid’ sounded like something biological, so we took that inspiration and made each Ecoid look like a cell. Humidity affected the size of the Ecoids, the light level affected the glow around them, temperature affected their proximity to the centre of the screen, and the stretch value made the background bubble move around more.
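Each of those mappings boils down to linearly rescaling a sensor reading onto a visual range. As a rough illustration only (this is not the original Processing code, and the input ranges here are guesses), it might look like:

```java
// Hypothetical sketch of sensor-to-visual mappings; ranges are assumptions.
public class EcoidLook {
    // Linear map of v from [inMin, inMax] to [outMin, outMax], clamped.
    public static double map(double v, double inMin, double inMax,
                             double outMin, double outMax) {
        double t = (v - inMin) / (inMax - inMin);
        t = Math.max(0.0, Math.min(1.0, t));
        return outMin + t * (outMax - outMin);
    }

    // Humidity (assumed 0-100 %) -> cell radius in pixels.
    public static double radius(double humidity)  { return map(humidity, 0, 100, 10, 60); }
    // Light level (assumed 0-1023 raw) -> glow alpha (0-255).
    public static double glow(double light)       { return map(light, 0, 1023, 0, 255); }
    // Temperature (assumed -5 to 35 degrees C) -> distance from centre: hotter sits closer.
    public static double centreDist(double tempC) { return map(tempC, -5, 35, 300, 0); }
}
```

Processing’s built-in `map()` does the same rescaling; clamping the input keeps an out-of-range reading from producing, say, a negative radius.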
This project for me was an odd one, even for DAT.
We were asked to create a machine that could detect whether someone was in love or not; we split into groups, each given tools to do this. Unfortunately, we were given an Automagraph, a device made from two glass panes laid horizontally with three brass bearings in between. The user needed to rest their hand on it while being shown objects that represented something.
To try and measure ‘happiness’ as a trial run, I put my hand on the Automagraph while Sam showed me a presentation he had put together of all the things he thought made me happy. To track my hand movement, a graphics tablet was attached with the pen fixed to the end, so we could measure how my hand moved.
The results seemed to show that I moved when I was happy; however, I may subconsciously have moved just to see whether the device would work.
Afterwards we were set to combine technologies to make one Love Machine. Unfortunately, the group as a whole decided that the newer technologies would be much better, so we really didn’t have much to contribute to the discussion.
I presented my rendition of Superstition to the class and seemed to get a fairly positive review after a while.
I explained my process: how I entered the values and played it back. It took a while, and a little pointing out of some of the notes, but most people seemed to get it in the end. Here is my final project:
After many a Google search I found an add-on for MaxMSP (made by Cycling ’74, the company behind Max) which allowed me to enter notes from an .xml feed.
I quickly regretted my ‘braveness’ as I remembered that I had neither any musical talent nor the ability to read music. I managed to find some sheet music for Stevie Wonder’s Superstition and decided that it would be the song I re-represented.
I then worked out from the notation the notes themselves, their durations and their pitches, which took many, many hours.
For this project we were introduced to a program called MaxMSP. MaxMSP gives you the ability to create unique sounds and stunning visuals, and allows you to make engaging interactive media. I had never used this program before, so it was a new experience for me.
Its interface was very object-oriented, which suited me as I dislike coding-heavy programs.
We were given the first session to have a play with the program and a crash course in its functionality. The examples included patches to run MIDI, audio, video, Arduino and make_controller.
In the second week I brought in my own Kinect to see if we could use two to create a cool 3D image; however, I was swiftly reminded that this wouldn’t work. As it took us a long time to get one working again, we didn’t really have time to create much.
When we did, we started work on mapping our faces to it, creating a ‘Mount DATmore’. We got the faces done and were ready to continue; however, when we wanted to work on it at home, we discovered we didn’t have the software we needed and that we had exported it wrongly.
For the presentation, we explained our situation and then used the uni computers to show him what we would have done.
Due to the difficulty of this task, I may rethink my final project.
For this project we were given Kinects to play with, something I am very pleased about, as I hope to use one for my final project and could use the practice, so I asked to ‘take the lead’ on this project. I was in a group with Sam Stein, Scott Addelsee & Tom Saunders.
As it turns out, the Kinect is really difficult to use with 3DS Max. We all tried to get it to work, but none of us had any idea how. Eventually, with a lot of help, we got somewhere, but still didn’t have an idea for what to make.
For our first project I was in a group with Sam Stein, Scott Addelsee & Tom Saunders; we were asked to visualise any RSS feed data in an interesting way.
We decided on a feed that we thought would be cool, one based on earthquake data, and went with a familiar program to run it in: Flash.
After a week, we came back with an interesting visualisation of square blocks in a ‘box’ that shook around according to the earthquake data, using a physics engine to simulate it.
I think (fingers crossed) it’s finally done! I’ve set up a page to demo online for people to see; it shows the values for Heinz Beef Ravioli. The Android app scans the product, directs the user to this page, brings in the data from the database and displays the relevant information.
You can also download our app here:
While researching, Sam came across a table showing how long it takes to burn off calories doing various sporting activities. This, I think, is a great feature, and it has been added to the database as well as the table.
I also managed to get a product working, so I went out and bought some items to test this with, and have had no problems as yet.
We have decided on an official, non-chocolate-bar-related name of ‘FoodScannAR+’: it will be scanning food, it would be augmented reality if we have time to put this information around the product using a web app, and the plus is there for no reason.
We wanted our webpage to be something more than just text and numbers, so we took inspiration from this infographic:
This infographic is easy to understand, so we wanted to emulate it in some way. I found a nice jQuery library (cue countless jQuery YouTube tutorial videos) which is exactly what I need to make the page more visually interesting.
I think the use of graphical information is important as it will set itself apart from other similar apps.
To add more difference from this and other apps, we decided on some categories which are as follows:
- Daily Allowance
- Health Effects
- Source of products
Scott has decided to take another approach and is using Processing instead, so I was left to finish the Android app myself. It took a couple of full days in all, but I think I managed to get it to work, and I have now installed it on my phone. It has replaced my original barcode scanner, which is nice, as I can still use it to scan QR codes. This will make a nice unexpected ‘extra feature’.
As we have heard nothing back from any of the people we called, the research was lacking results, and a panic-induced midnight trip to the library to look through law journals failed to turn up anything relevant, we have decided to stick with the healthy-diet idea, but enhance this part of our original concept. The immaterial aspect of the project will hopefully add another layer of information for users with dietary requirements/needs.
After a few hours of using Eclipse yesterday, all our libraries refused to work, and we had no idea why. As we found out several hours in, we needed to make sure the SDK was set to Android 2.3, not 2.2. This was a large setback.
Sam has also started constructing our database and was able to use some basic lines of PHP to bring values into the page.
As our projects are similar in that we both need a barcode scanner that searches our database, we joined forces with Scott Addelsee to try and crack the Android app. We were using Eclipse, but we didn’t get too disheartened and started adapting the code as we saw fit. We began by changing what the Google Shopper part of the app did, as it already pulled in the barcode value (upc) and placed it in a predefined URL, which was what we wanted.
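The adaptation described above essentially boils down to dropping the scanned barcode value into a fixed lookup URL. A tiny sketch of that step, with a placeholder host and query parameter (the project’s real URL and helper names aren’t shown here, so these are invented):

```java
// Hypothetical sketch: turn a scanned UPC into a database lookup URL.
// The base URL below is a made-up placeholder, not the project's address.
public class UpcLookup {
    public static final String BASE_URL =
        "http://example.com/foodscannar/lookup.php?upc=";

    public static String lookupUrl(String upc) {
        // Keep digits only, in case the scanner result carries stray whitespace.
        return BASE_URL + upc.replaceAll("[^0-9]", "");
    }
}
```

The resulting URL is what the app would hand to the browser (or web view) to display the product page.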
While testing this app, the barcode we were using was from a Kit Kat Chunky wrapper, and so, as we don’t have a name yet, it shall henceforth be known as the ‘Kit Kat Project’, until a better name arrives.