I need to investigate what options I can employ on my website to control audio hardware through code. This idea came from feedback given by my tutor and course leader, Julius Ayodeji.
I also need to explore whether it is ethical to do so, and what would be required to include it ethically as a feature on the website.
The feasibility of implementing it into the website needs to be evaluated, as well as its practicality against the client's vision.
- MIDI (Musical Instrument Digital Interface)
- API (Application Programming Interface)
“Designed in the early 80’s by several music industry representatives, MIDI (short for Musical Instrument Digital Interface), is a standard communication protocol for electronic music devices. Even though other protocols, such as OSC, have been developed since then; thirty years later, MIDI is still the de-facto communication protocol for audio hardware manufacturers. You will be hard-pressed to find a modern music producer that does not own at least one MIDI device in his studio.”
The Web MIDI API is not very well known among front-end web developers (I'm front-end myself), though it is increasingly popular with HTML5 game developers. Browser support is limited: at the time of writing it is only available in Google Chrome, and only after enabling an experimental flag in the browser's settings (chrome://flags).
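Because support is so patchy, a page would want to feature-detect the API before trying to use it. A minimal sketch of that check, with the browser's navigator object passed in as a parameter so it can be exercised outside a browser (the message handler in the comment is purely illustrative):

```javascript
// Hedged sketch: feature-detect the Web MIDI API before using it.
// `nav` stands in for the browser's `navigator` object.
function supportsWebMIDI(nav) {
  return typeof nav.requestMIDIAccess === "function";
}

// In a supporting browser you would then request access, e.g.:
// if (supportsWebMIDI(navigator)) {
//   navigator.requestMIDIAccess().then((midi) => {
//     midi.inputs.forEach((input) => {
//       input.onmidimessage = (msg) => console.log(msg.data); // raw MIDI bytes
//     });
//   });
// }
```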
This is one method of controlling audio hardware through website code, but it's not quite what I need for this website: it relies on the user owning a MIDI device such as a keyboard.
“Until recently, the ability to play any type of audio within a browser involved using Adobe Flash or other browser plugins. Although Adobe’s Flash player is unquestionably the most ubiquitous of these, most developers and designers would agree that it’s better not to rely on a plugin at all.
One of the most exciting and long-awaited features in HTML5 is the <audio> element, enabling native audio playback within the browser. We can take advantage of this now as all of the major browsers support it — currently Firefox, Chrome, Safari and Opera, and Internet Explorer 9+. For browsers that don’t support audio natively, we can easily fallback to Flash.”
So HTML5 audio is cutting-edge, as is the markup language it branches from. For me it means that playing any audio from the website will more than likely have to be through HTML5.
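The fallback decision the quote describes can be sketched as a simple feature test. The document object is passed in here so the check can run outside a browser; the Flash-embed step itself is omitted:

```javascript
// Hedged sketch: decide between native HTML5 audio and a Flash fallback.
// `doc` stands in for the browser's `document` object.
function canPlayNativeAudio(doc) {
  const el = doc.createElement("audio");
  // Real <audio> elements expose canPlayType(); in older browsers the
  // tag is unrecognised and the method is missing, which signals
  // "fall back to Flash".
  return typeof el.canPlayType === "function";
}
```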
“Currently, the HTML5 spec defines five attributes for the <audio> element:

- src — a valid URL specifying the content source
- autoplay — a boolean specifying whether the file should play as soon as it can
- loop — a boolean specifying whether the file should be repeatedly played
- controls — a boolean specifying whether the browser should display its default media controls
- preload — none / metadata / auto — where ‘metadata’ means preload just the metadata and ‘auto’ leaves the browser to decide whether to preload the whole file”
These attributes will help a lot with customising how my audio is presented. For example, I could show the browser's default media controls by adding the controls attribute to the <audio> tag. For the website this would be most useful in the musical scores section, where I could have a play button alongside the preview of the musical score.
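A minimal sketch of how those five attributes might come together for the score-preview player. The file path is hypothetical, and the document object is passed in as a parameter so the function can be tested outside a browser:

```javascript
// Hedged sketch: build a score-preview player using the attributes above.
function makeScorePlayer(doc, src) {
  const audio = doc.createElement("audio");
  audio.src = src;            // URL of the recording (hypothetical path)
  audio.controls = true;      // show the browser's default play/pause controls
  audio.autoplay = false;     // wait for the visitor to press play
  audio.loop = false;         // play the piece once
  audio.preload = "metadata"; // fetch duration etc., not the whole file
  return audio;
}

// In the browser:
// document.body.appendChild(makeScorePlayer(document, "scores/prelude.mp3"));
```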
So that's fairly straightforward; the hard part, however, would be making the text and music coincide, which would probably require a much deeper understanding of HTML5. That would take time which, unfortunately, I don't have before this project is over. With more time I would pursue that feature.
Managing Audio Output Hardware on an Android/iOS Device
Trying to find information on how to manage a user's audio hardware for media on my website was a very difficult task indeed. I'm not inept at online research by any stretch of the imagination; I simply found that managing audio hardware for a user is more of a back-end developer's area. I'm at most a front-end developer, though mostly a designer.
I did however find information regarding how to manage audio hardware on Android/iOS devices.
On Android, an app can essentially query the system's AudioManager to find out what audio hardware a user is using. From checking what audio hardware a user uses to then managing it is a simple step.
The same can be said for iOS mobile devices, where the coding approach is similar.
Unfortunately, I won't be developing the website for mobile devices in enough detail to include audio hardware queries and triggers; the client neither wants nor needs them.
I do not need the Web MIDI API either, as my clients will not be using a MIDI keyboard, but it was fascinating to understand how it works and to see that it's an increasingly popular area of audio web development.
The main research I'll take away from this project is the HTML5 audio elements. The five attributes I uncovered are just some of the useful audio attributes I could use on the musical scores section.
I failed, though, to find anything that could check or change a user's audio hardware from code I could put into the website. Even so, it was a fascinating and informative journey through some of the possibilities of manipulating and including audio on a website, even if I didn't find what I was initially looking for.
The only research I found that came close was manipulating mobile audio hardware for Android/iOS devices.