
Research

In the course of our research, we discovered many tools that can help us achieve the project goal. This document contains the research done for both [Dance] and [Dance Console], and is therefore the [Dance Project] Research. To read about the conclusions drawn from the research, see the complete Dance Project Description and its accompanying documents.

Technical


Question: How do we get data from arenas to form the arena layout and map the devices?

Answer: In our research we found many [API]s that provide useful information, but all of them are limited in some way and expensive.
We studied integrations with Ticketmaster, SeatGeek, Mappedin, Seats.io, Mapwize, Steerpath and Indoor Google Maps. The pros and cons of each one are listed below:

| API | Pros | Cons |
| --- | --- | --- |
| Ticketmaster | It has many event details, and we can use them to check how many seats were bought along with other additional information. | If our client doesn't have a Channel Partner key, we can't get the ticket information. The Section Map doesn't return much data; it only returns an image that could be used to design our arena if we implement artificial intelligence someday. |
| SeatGeek | Their API is very detailed. There is information about ticket pricing as well as information about the venue, such as coordinates. | The API documentation states that they have no plans to expose individual ticket listings via the API, so we can't get any ticket information. We couldn't find any information about the arena area or any kind of drawing. |
| Mappedin | It has pre-built solutions and a great UI. | We can't use our own UI with this one. They communicate only through their own [SDK] for iOS, Android and Web (a JavaScript package), without a REST API. |
| Seats.io | It can be used to draw an arena and sell the tickets. | We can't use our own UI with this one; we can only use HTML code to display their UI, and we can't save arena data into our [database]. They charge per seat because it's an API created for selling tickets. |
| Mapwize | It has good support for [Flutter]. It's a good tool for designing indoor maps. | We can't use our own UI with this one. This API is great for designing maps of malls, but it can be harder to use for designing arenas. |
| Steerpath | — | It has terrible documentation; we couldn't figure out which data we could get from their API. |
| Indoor Google Maps | It has great integration with [Flutter] and it's easy to use if the arena is already mapped. | We would need to send a floor plan to their email if there isn't already a map for the arena. |
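As an illustration of what these integrations look like in practice, here is a minimal sketch of querying SeatGeek for a venue's events, assuming their public /2/events endpoint and client_id authentication; the response fields shown are illustrative and should be checked against their documentation.

```typescript
// Sketch: fetching event and venue data from SeatGeek's public API.
// Assumes the /2/events endpoint and client_id query parameter; the exact
// response fields (stats, venue.location) should be verified against their docs.
const SEATGEEK_CLIENT_ID = process.env.SEATGEEK_CLIENT_ID ?? "";

interface SeatGeekEvent {
  id: number;
  title: string;
  datetime_local: string;
  stats?: { lowest_price?: number; highest_price?: number };
  venue?: { name: string; location?: { lat: number; lon: number } };
}

async function fetchVenueEvents(venueId: number): Promise<SeatGeekEvent[]> {
  const url =
    `https://api.seatgeek.com/2/events?venue.id=${venueId}` +
    `&client_id=${SEATGEEK_CLIENT_ID}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`SeatGeek request failed: ${res.status}`);
  const body = (await res.json()) as { events: SeatGeekEvent[] };
  return body.events;
}
```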

Because of these [API] limitations, we will use a service we created to design custom arenas (similar to Seats.io), and in some cases we will use Indoor Google Maps. We will also have pre-built models for well-known arenas.
Our first approach for the prototype is to relate colors with time and sections; GPS will come later, because the first version of our arena design service does not yet relate coordinates with sections. Once the coordinate system is ready, we will relate GPS location with time and colors. For the prototype we can use the solution we discussed: letting the user input the section they are currently in.
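A minimal sketch of that prototype data model, using hypothetical names (ShowSequence, ColorCue) of our own: a show is a set of timed color cues keyed by section, and the phone picks the cue that matches the elapsed show time.

```typescript
// Sketch of the prototype data model: a show is a set of timed color cues
// keyed by arena section. Names (ShowSequence, ColorCue) are our own
// illustration, not an existing API.
interface ColorCue {
  startMs: number;   // offset from the start of the show
  durationMs: number;
  colorHex: string;  // e.g. "#FF2D95"
}

interface ShowSequence {
  showId: string;
  // section id (e.g. "A-101") -> ordered list of cues for phones in that section
  cuesBySection: Record<string, ColorCue[]>;
}

// Given the section the user typed in and the elapsed show time,
// pick the color their phone should display right now.
function currentColor(seq: ShowSequence, section: string, elapsedMs: number): string | null {
  const cues = seq.cuesBySection[section] ?? [];
  const active = cues.find(
    (c) => elapsedMs >= c.startMs && elapsedMs < c.startMs + c.durationMs
  );
  return active ? active.colorHex : null;
}
```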


Question - How do we get the ticket and seating data?
Answer - For the prototype, we decided to let the user input their own location. For the MVP we plan to build another platform that sells tickets, and through its API we will be able to get all the data we need.
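As a rough illustration only, this is the kind of shape our own ticketing API could return for the MVP; every name and URL here is a placeholder, not a finished contract.

```typescript
// Hypothetical shape of the data our own ticketing platform's API would expose
// for the MVP; field names are placeholders, not a finished contract.
interface TicketRecord {
  ticketId: string;
  eventId: string;
  section: string;     // maps directly to a section in the arena layout
  row?: string;
  seat?: string;
  purchasedAt: string; // ISO 8601 timestamp
}

async function fetchTicketsForEvent(eventId: string): Promise<TicketRecord[]> {
  // Placeholder URL: the real endpoint will live on the platform we build.
  const res = await fetch(`https://api.example-dance-tickets.dev/events/${eventId}/tickets`);
  if (!res.ok) throw new Error(`Ticket API request failed: ${res.status}`);
  return (await res.json()) as TicketRecord[];
}
```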


Question - Research and document [API]s that could be used to enhance the product.
Answer - As mentioned, we found many APIs that could be useful while developing the application. Most of them are already listed under the previous question, but we also found [API]s for music information:

| Music API | Description |
| --- | --- |
| YouTube | We can get a list of music and use a player inside our app to extract the notes and decibel levels. |
| Spotify | There is a plugin that helps with the integration with their API. We can also use their package to play a song from a URI, but it requires user authentication. |
| Audd | We can use it to recognize the music playing live and relate it to the pre-built sequence. This API also gives us links to preview the songs. |
| Deezer | We can use it to search for music and play it. |
| iTunes | We can use it to search for music and play a preview. |
| lastFM | We can use it to search for music and play the full track. |
| Acrcloud | Music recognition. |
| Gracenote | Music recognition. |
| Dejavu | Free, open-source music recognition. Downside: you need to host your own [database]. |
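For example, here is a sketch of searching Spotify's Web API for a track and reading its 30-second preview URL, assuming an OAuth access token is already available; the endpoint and field names should be confirmed against Spotify's documentation.

```typescript
// Sketch: searching Spotify's Web API for a track and reading its preview URL.
// Assumes a valid OAuth access token is already available; the /v1/search
// endpoint and the tracks/items/preview_url fields should be confirmed
// against Spotify's documentation.
interface SpotifyTrack {
  name: string;
  uri: string;                 // can be handed to the player plugin
  preview_url: string | null;  // 30-second preview, when Spotify provides one
  artists: { name: string }[];
}

async function searchTrack(query: string, accessToken: string): Promise<SpotifyTrack | null> {
  const url = `https://api.spotify.com/v1/search?type=track&limit=1&q=${encodeURIComponent(query)}`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
  if (!res.ok) throw new Error(`Spotify search failed: ${res.status}`);
  const body = (await res.json()) as { tracks: { items: SpotifyTrack[] } };
  return body.tracks.items[0] ?? null;
}
```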

We also found [API]s for social media, camera, device location, augmented reality, filters (DeepAR and Banuba), flashlight, audio playback, and machine learning (Cloudmersive, PixLab, Apache PredictionIO, Sentence Clustering API and wit.ai).


Question - What’s the best approach for the application: web or mobile?
Answer - To start we will develop a mobile application that runs on tablets. In the MVP we will enable the web capabilities and make the necessary UI adaptations.


Question - How will we store, manage and process data? And which [database]/[server] provider should we use?
Answer - This is a big one and so we will break this down into additional questions:


Question - The first question we should answer is: do we need an [SQL] or [NoSQL] [database]?
Answer - We will use a [NoSQL] [database] ([Firebase]) because it is the far superior option even with its downsides. Keeping the [database] consistent would normally require extra work, since we need to duplicate data, but this is not a problem in [Cloud Firestore] because [Firebase] provides many mechanisms to keep the data clean. However, once we outgrow Firestore and move to our own [server], implementing these functions to keep the data organized will be a big job.
For the prototype we will use [Cloud Firestore] and switch to our own [server] hosted on [Google Cloud Platform] once we start working on the [MVP]; at that point we will implement all the necessary protocols to keep the data from getting messy. Pricing for both options is available on their respective pricing pages, as is an overview of [Firebase]'s platform.
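One of the mechanisms mentioned above is Cloud Functions: a trigger can fan duplicated data back out whenever the source document changes. The sketch below assumes hypothetical "arenas" and "shows" collections with an arenaName copy on each show document; it is an illustration of the approach, not our final schema.

```typescript
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Sketch (firebase-functions v1 API): when an arena document is renamed,
// fan the new name out to the duplicated copies stored on each show document.
// Collection and field names are our own assumptions for illustration.
export const syncArenaName = functions.firestore
  .document("arenas/{arenaId}")
  .onUpdate(async (change, context) => {
    const before = change.before.data();
    const after = change.after.data();
    if (before.name === after.name) return null;

    const shows = await admin
      .firestore()
      .collection("shows")
      .where("arenaId", "==", context.params.arenaId)
      .get();

    const batch = admin.firestore().batch();
    shows.forEach((doc) => batch.update(doc.ref, { arenaName: after.name }));
    return batch.commit();
  });
```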
Question - The second question we need to answer is: are we willing to put a lot of effort into the backend? A solid [backend] requires special care with security, infrastructure, etc. I would prefer to put more effort into the [frontend] and use a [BaaS], but that can get expensive over time.
Answer - We should be focused on the [frontend] and, most importantly, on making sure the project functions as a whole; therefore, any [BaaS] we can use relieves us from having to focus on yet another problem. This leaves me with the same conclusion as stated in the questions above: we start the prototype on the well-established and reliable [BaaS] [Cloud Firestore] provided by [Firebase] and switch over to our own servers and data management for the [MVP] in order to lower costs.
Question - How will our [database] be structured? How much information should it hold?
Answer - This is answered in the Data Structure section.

Question - What are the inputs and outputs of each system (apps, boards, etc.)?
Answer - We are not going to integrate with boards or any third parties in the prototype.

Question - How do we connect each user in the stadium to the data that will be sent from the console (which is in that same stadium)? Will the designer create something like a “room” for that specific show, so that each user has to connect to this “room” regardless of the stadium?
Answer - The user doesn't need to check in; the app will check whether the user's GPS location is near the arena's location.
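A minimal sketch of that proximity check using the haversine formula; the 500 m radius is an arbitrary placeholder, not a decided value.

```typescript
// Sketch of the proximity check: compare the phone's GPS fix with the arena's
// stored coordinates using the haversine formula. The 500 m radius is an
// arbitrary placeholder, not a decided value.
const EARTH_RADIUS_M = 6_371_000;

function haversineMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

function isUserAtArena(userLat: number, userLon: number, arenaLat: number, arenaLon: number): boolean {
  return haversineMeters(userLat, userLon, arenaLat, arenaLon) <= 500;
}
```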

Question - How can we connect the [DMX] (lighting controller) with the designer/user (if needed)?
Answer - We will not use DMX in the prototype.

Question - How do we create the mesh of phones? How can we know how far apart the phones are from each other in every direction?
Answer - We will use input from the user to select which section they are currently in, and based on that they will receive the combination of colors that the designer built for that section.
For the prototype we are not going to relate the distance between phones to anything.
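A sketch of how the phone could receive those colors once the user has entered their section, assuming the console writes a cue document per section in [Cloud Firestore]; the document path and field names are our own illustration.

```typescript
import { getFirestore, doc, onSnapshot } from "firebase/firestore";

// Sketch: once the user has typed in their section, the app subscribes to the
// cue document the console writes for that section. The document path
// ("shows/{showId}/sections/{section}") and the currentColor field are
// assumptions for illustration.
function listenToSectionCues(
  showId: string,
  section: string,
  onCue: (colorHex: string) => void
) {
  const db = getFirestore();
  const cueDoc = doc(db, "shows", showId, "sections", section);
  // Returns the unsubscribe function so the app can stop listening after the show.
  return onSnapshot(cueDoc, (snapshot) => {
    const data = snapshot.data();
    if (data?.currentColor) onCue(data.currentColor as string);
  });
}
```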

Question - How will we synchronize the lights with the live show?
Answer - We can use the existing solutions we found to identify the notes and decibel levels in each part of the music; this way we can synchronize the lights. But this is experimental and we are not sure it's the best approach. The first solution we are going to implement is for the designer to synchronize the lights during the show by adding or removing time from the sequence.
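A minimal sketch of that first approach: the designer nudges a single offset during the show, and every cue lookup uses the shifted clock. The function names are our own illustration.

```typescript
// Sketch of the first synchronization approach: the designer nudges a global
// offset forwards or backwards during the show, and every cue time is shifted
// by that offset before it is compared with the live clock.
let designerOffsetMs = 0; // positive = delay the sequence, negative = bring it forward

function nudgeOffset(deltaMs: number): void {
  designerOffsetMs += deltaMs;
}

// Returns the elapsed show time that cue lookups should use.
function effectiveElapsedMs(showStartedAtMs: number, nowMs: number = Date.now()): number {
  return nowMs - showStartedAtMs - designerOffsetMs;
}
```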