#Love

A full stack Node.js data visualisation app that tracks worldwide Twitter mentions of #Love, set to a full-screen Miley Cyrus 'Adore You' video with a cross-platform mp3 audio fallback.


This project started as an investigation into the npm Twitter package. There are two main ways to use the Twitter API: REST(ful) and Streaming.
I've used the REST API a few times but had never had a chance to look at the Streaming API.

Upon doing so I discovered a problem. I wanted to use the 'track' functionality, which targets tweets containing a specified hashtag. After the initial setup I found it quite difficult to test, as the hashtag I was using, '#Javascript', wasn't tweeted often enough for my app to pick up on it.

After a bit of experimenting with other tags, '#ok', '#cats' and '#justinbieber', I discovered that '#love' returned the most tweets.

Now that I had a hashtag that was used often enough to see sufficient and regular responses, I decided to create a project around it.

At first I was just going to have a stream list that displayed the tweets containing #love, adding a low poly heart each time #love was tweeted. Since a lot of the development was going to use three.js, I thought I'd keep the layout fairly standard looking, so I gave Material Design Lite a try. It turns out MDL is pretty heavy and actually quite horrible to work with.

However, I do sometimes wonder if these frameworks are truly the answer. I think I'm just a fan of actually understanding what I'm coding, so I ended up opting for good old vanilla CSS.

This is what iteration 1 looked like. As for the reason for adding a Miley Cyrus video in the background... well, what project isn't improved by Miley Cyrus!?

I was happy with the general idea here, a new heart every time #love is tweeted, but something was missing... I had to go full screen Miley... and so what was a video header became a full bleed background video.

It can be a bit tricky getting 16:9 video to display correctly at any viewport size, but using 'min-aspect-ratio' and 'max-aspect-ratio' media queries to control the position and scale of the video gets round this.

```css
@media (min-aspect-ratio: 16/9) {
  .video-bg video,
  .video-poster img {
    height: 300%;
    top: -100%;
  }
}

@media (max-aspect-ratio: 16/9) {
  .video-bg video,
  .video-poster img {
    width: 300%;
    left: -100%;
  }
}
```
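The arithmetic behind those media queries can be sketched as a small function; `coverAxis` is an illustrative name of my own, not part of the project:

```javascript
// Decide which axis a 16:9 video must overflow to cover the viewport.
// Mirrors the min/max-aspect-ratio media queries: a viewport wider than
// the video stretches its height; a taller viewport stretches its width.
function coverAxis(viewportWidth, viewportHeight, videoRatio = 16 / 9) {
  const viewportRatio = viewportWidth / viewportHeight;
  return viewportRatio >= videoRatio ? 'height' : 'width';
}

console.log(coverAxis(3440, 1440)); // ultrawide monitor → 'height'
console.log(coverAxis(375, 667));   // portrait phone → 'width'
```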

The project was now starting to take shape, but since I'd ditched MDL I needed to jump back into Photoshop to choose some fonts and colours and to work out how this would look on device.


The CSS for any RWD site is fairly easy to get to grips with, and previewing a project in Google Resizer really helps. But cross-platform web apps aren't just about how they display; there's also how they work, and since I'm using autoplay video it's not really going to work on anything but desktop. So a graceful fallback solution is required.

For this I'm using Modernizr, or more specifically its video autoplay detect.

```javascript
Modernizr.on('videoautoplay', function (result) {
  if (result) {
    App.config.media = 'video';
    App.addVideo();
  } else {
    App.addFallback();
    App.config.media = 'audio';
  }
});
```

The above check runs on page load and decides which 'media' method to inject: if autoplay video is supported, run 'App.addVideo()'; otherwise, 'App.addFallback()'.
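Stripped of the DOM work, the decision itself reduces to a pure function, which makes it easy to unit test; a minimal sketch (`pickMedia` is my own name, not from the repo):

```javascript
// Mirror the Modernizr branch: autoplay support selects video, otherwise audio.
function pickMedia(autoplaySupported) {
  return autoplaySupported ? 'video' : 'audio';
}

console.log(pickMedia(true));  // 'video'
console.log(pickMedia(false)); // 'audio'
```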

```javascript
App.addVideo = function () {
  App.dom.videoBg = document.getElementById('videoBg');
  App.dom.video = document.createElement('video');
  App.dom.video.setAttribute('id', 'video');
  App.dom.video.setAttribute('loop', true);
  App.dom.video.setAttribute('poster', '/videos/Adore_You_720x480.jpg');
  App.dom.vidSource = document.createElement('source');
  App.dom.vidSource.type = 'video/mp4';
  App.dom.vidSource.src = '/videos/Adore_You_720x480_800kbps.mp4';
  App.dom.video.appendChild(App.dom.vidSource);
  App.dom.videoBg.appendChild(App.dom.video);
};
```

or

```javascript
App.addFallback = function () {
  App.dom.audio = document.createElement('audio');
  App.dom.audio.setAttribute('id', 'audio');
  App.dom.audio.setAttribute('loop', true);
  App.dom.audSource = document.createElement('source');
  App.dom.audSource.src = '/audio/Adore_You.mp3';
  App.dom.audio.appendChild(App.dom.audSource);
};
```
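Both builders follow the same create/configure/append pattern, so they could be collapsed into one helper. A sketch under my own naming (`buildMedia` isn't in the project); passing the document in keeps it testable outside a browser:

```javascript
// Build a looping media element (<video> or <audio>) with a single <source>.
function buildMedia(doc, tag, src, type) {
  var el = doc.createElement(tag);
  el.setAttribute('id', tag);
  el.setAttribute('loop', true);
  var source = doc.createElement('source');
  if (type) {
    source.type = type; // e.g. 'video/mp4'; the audio source omits it
  }
  source.src = src;
  el.appendChild(source);
  return el;
}

// Usage: buildMedia(document, 'video', '/videos/Adore_You_720x480_800kbps.mp4', 'video/mp4');
```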

I borrowed this heart from BlendSwap and used the three.js JSONLoader to load the heart model ready for cloning whenever #love was tweeted.

Then, using npm Twitter and socket.io to track mentions of #love...

```javascript
var io = require('socket.io').listen(server);

io.sockets.on('connection', function (socket) {
  client.stream('statuses/filter', { track: '#love' }, function (stream) {
    stream.on('data', function (obj) {
      socket.emit('sending-data', {
        tweetText: obj.text,
        userName: obj.user.screen_name
      });
    });
    stream.on('error', function (error) {
      // console.log(error);
    });
  });
});
```
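The shape of the emitted payload is worth pinning down; a sketch of the mapping from a raw tweet object to the 'sending-data' event body (`toPayload` is an invented name, but the field names match the emit above):

```javascript
// Shape the raw tweet object from the stream into the socket.io payload.
function toPayload(tweet) {
  return {
    tweetText: tweet.text,
    userName: tweet.user.screen_name
  };
}

var sample = { text: 'All you need is #love', user: { screen_name: 'paulie' } };
console.log(toPayload(sample));
```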

...which are then passed over to a function that listens for stream data, clones a new heart, and creates a new DOM element containing the username and tweet text.

```javascript
App.socket.init = function () {
  App.listen = io('http://pauliescanlon.io/projects/love');
  App.listen.on('sending-data', function (data) {
    if (App.config.running === true) {
      App.socket.createDomElement(data);
      App.hearts.clone();
    }
  });
};
```

That's pretty much the bones of the project, though there are a few other bits in there too. For instance, if you minimise your browser window or open a new tab, the 'media' is paused using a 'visibilitychange' listener, and a clean-up function unloads the low poly hearts after they've disappeared from view.
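That pause behaviour can be sketched as a small handler; taking the media element as an argument (rather than reading App state, as the real project may do) keeps it testable:

```javascript
// Pause the media when the tab is hidden, resume when it's visible again.
function onVisibilityChange(media, hidden) {
  if (hidden) {
    media.pause();
  } else {
    media.play();
  }
}

// Browser wiring (sketch):
// document.addEventListener('visibilitychange', function () {
//   onVisibilityChange(App.dom.video, document.hidden);
// });
```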

The project is up on my GitHub if you would like to know more.

...and for the record, I'm not a Miley Cyrus fan and would be quite happy to never hear this song again :)