
Kero and Defasten made our virus dreams into a futuristic music video (CDM premieres)

What do you do when you can’t get a virus off your mind? Channel it into a bioscience audio-vision of immunity that reflects the new reality. We talk to Defasten and Kero about their music video for “Lodge.”

We need this sort of fantasy and escape now, I think. But do also check in on the reality in Detroit and the USA – to all our colleagues and friends and family there, we are with you, from Berlin to LA and around the world.

Highways is the kind of EP that might soothe your mood now – it’s a pulsing, electronic, unfamiliar world, but somehow comfortable. It’s music to disinfect to – dry, irregular acid lines, asymmetrical rhythms, but then mellow harmonies set against them. “Chrysler” sounds like a floating Detroit concept car, after hours, a stylish opener punctuated by a wonderful, bizarre bass line. “Southfield” is urgent and groovy; “Fisher” a growling post-apocalyptic IDM deconstructed-electro. “Davison” is delightfully weird reserved glitch. This is Michigan, yes, but through some Tron filter – enter the sadistic game grid.

“Lodge” rounds out the release, and it’s to me the most ambitious – and striking – culmination of Kero’s concept here. Its abstract cycle never quite materializes, a stuttering sound sculpture trying to escape an Enterprise transporter pattern buffer, but with beautiful, murky pad clusters breathing in and out in the background.

That music is evocative even if you close your eyes, but Defasten gives us a bio-science concept visual – unsettling but eerily pretty turquoise and purple 3D imagery. Watch:

Video: www.defasten.com

This is built in Notch, the same 3D software Ted Pallas used in the xR experiments I wrote about yesterday. (Ted reviewed this 3D software for CDM – I’m editing that review now.)

Here’s what Kero and Defasten had to say about their work here.

Peter: Want to say anything about the music, sounds? Love the vibe so – could be either the gear or the feeling you had, or both?

I have always had a love for futuristic GUIs – an ode to classic 80s Miami Vice and futurist/cyberpunk aesthetics. Defasten basically just worked with the aesthetic concepts he knew I already loved, but added a few elements of medical and scientific imagery that we both felt was inspired by the world’s current crisis.

Kero’s rig for this release: Eurorack modular (of course, classic Doepfer stuff at top anchoring the setup), Elektron Analog Rytm MKII, Elektron Analog Four MKII, and Teenage Engineering OP-Z, which is nearly camouflaged in gray against its larger fellow Swedish gear.

For both of you – I mean, is there something cathartic and calming about really diving into the science here, understanding what this thing is we’re up against? Or how did you feel about the viral content?

Patrick from Defasten: The Lodge video is visibly a comment on what we’re experiencing right now globally. I found the microsounds of the track to evoke the biology of our bodies, the microscopic world that constitutes our being. It was of interest to interpret this graphically, with the real-time synthetic imagery discreetly reacting to Kero’s sonic pulses.

That said, there is indeed something calming when you’re focusing on crafting an idea, isolated at home. I think a lot of academics and researchers of any field can relate to the isolation required to develop an idea. This is the strange calm required to quell the storm.

CDM: To ask another question, is there a feeling of being on the side of science and technology as an approach, not just sort of giving in to 1918 chaos?

Patrick: Let’s hope we don’t give in to the 1918 chaos. Looking at the numbers now, we are not experiencing loss of life at such magnitude. We are grateful that science, technology, and general quality of life have improved over the last 100 years. That said, massive loss of life in 2020 is still to be taken very seriously.

But yes – I am on the side of science and technology to combat the pandemic, in addition to the cooperation of everyone to respect the temporary measures in place in reducing the spread of the coronavirus. In 1918, they didn’t have the internet – we now have this luxury to have the latest info – either fake or real – relayed to our phones. In many ways, we’re equipped to handle a pandemic, however it doesn’t mean we should put all our faith in science – in reality, politics has played a huge role in the pandemic’s acceleration, and science only responds as a result.

Can you talk about how this collaboration came together? Obviously there’s tons of visual-sonic collaboration and boundary pushing on DU.

PD: It came about quite spontaneously. I’ve done more than a few works for Kero’s label over the past few years now; I think we understand each other musically and aesthetically and are generally in the same zone with our tastes and interests. So once again – he gave me carte blanche to design what I felt was right. This kind of creative collaboration is what I value most – when there is solid trust among the members involved, and the constructive dialog between the creatives enhances the process.

What’s next for this project and others? Will we see this bio-future-opera expanded?

PD: I’m interested in exploring the themes in the video further, but is it really a “bio-future-opera”? That is up to the public to decide. Prior to the pandemic, I was already interested in the intersection of biology, technology, the need to improve/augment our bodies, and the inner world that is within us all. I think, instead of the over-emphasis on AI we’ve seen in the 2010s, the time is ripe for the creative and tech industries to re-examine themselves, expand their interests, and push towards a new awareness and understanding of what is already all around us – not only to gain immunity to timeless viruses, but to understand and unlock the secrets of the microscopic world we rely on to be alive, and of course to respect its boundaries. This of course will not be a smooth journey.

How do you hope people will watch this? I turned out the lights and went VERY full screen in the dark. But with all these streams around, I wonder if you have a vision for how we can have some more, say, quality immersion.

PD: What you did sounds like a good idea. The video is a slow burner and doesn’t require your constant attention – watch it on your phone if you like. There’s a lot of micro detail, so the higher the screen resolution, the better. The original content was made in Notch, so it’s generated in real time, and it could loop forever – so ideally, a multi-screen installation running on real-time data in a large, dimly lit, architectural space. Sound familiar? 🙂 I also see it as a kind of backdrop to a sober, advanced-tech “mission briefing” in a large auditorium or hangar, with speakers of various expertise explaining to an audience the stakes at hand. Like a TED Talk, but with the PowerPoint presentation replaced by a holographic visual data presence.

The latest from the Detroit area, on the front lines

Next up from Detroit Underground is Joe Sousa, with a forthcoming audio tape, Infinite Cold Distance. He had these sobering words to share about the current situation in Michigan; he’s a respiratory therapist by day, so he’s right out there on the front lines.

Joe, you stay safe, too – and thanks for the update, especially as we deal with this worldwide. We can’t wait to hear your music.

COVID-19, week 3 southeast Michigan update:

This is going to be an account of my experiences during the boom of this virus, and to the point.

I was the charge therapist staffed during the initial weekend of SARS-CoV-2 infected admissions. Admittedly, a time of extreme uncertainty, with high anxieties felt across every profession in my building. You could feel it as you walked into rooms, as you talked to providers, as you tried to guide those around you with what knowledge you had. I am at a minimum pleased to say that phase has passed.

While my hospital (40 minutes away from epicenters such as Detroit or central Oakland County) is not yet challenged with at-capacity status, we are faced with the highest acute workload we have ever seen as a respiratory department. We have more ventilated patients than I’ve seen in my 8-year tenure. The patient population that requires critical care typically has at least one other health issue, but not all do. Age is barely a factor, but most people are between 40 and 80. That being said, there have been individuals in their early 20s – let that be a reality check.

Regarding diagnosis and treatment: this is an ever-evolving beast. Too many unknowns, and much is questioned daily, based not only on my personal research but on conversation with providers across the field. Those with hypertension, diabetes, and ACE inhibitor usage seem to be at highest risk. Conjecture around cytokine storm, vasculopathy, thromboembolism, and more points to an atypical presentation of ARDS. Some things I’ve read say it’s actually not true ARDS at all. Many lung mechanic strategies we implement end up being similar, but there is so much food for thought that I keep a consistently open mind. For the layman: this is a unique virus. Too new (“novel”), and lots to study. Full disclosure: I am not a physician or an infectious disease specialist, for example, but knowing how others think is important as a clinician, to best integrate my respiratory tactics into the care plan. I put this here just as insight into how every frontline team member is integrally involved in outcomes.

Regarding PPE: we as an institution early-implemented conservative measures, so we are not yet on the shortage side of the line. I anticipate this happening, though, if the supply chain has not been figured out by now. I also don’t believe in a first world country we should have to worry about “conserving” single-use protective equipment, so that thought is slightly daunting. Key point: I take my time putting on my gear and taking it off, properly, effectively, safely. It goes without saying that we, as the health care workforce, are exponentially more at risk than anyone social distancing or locked in their homes.

Beyond this, I truly feel for the other southeast Michigan hospitals nearing or at capacity. Michigan currently has the 4th-highest case count and the 3rd-highest death toll. And in the spirit of honesty: when the deaths come, they come swiftly. It is taxing to all of us, mentally and emotionally. I’m not a proponent of fear, but please stay home. Please be clean. Keep up your immunity and cardiopulmonary system with proper diet and exercise. Please take my word for it. And it’s probably best to lay off the NSAIDs for a while.

Finally, there has been one uplifting note: seeing the level of support from so many friends and family of mine – checking in, giving thanks – it’s all extremely encouraging. This matters more than you know, and I know my peers feel the same. Thank you to all the restaurants and individuals who donate meals and endless snacks to our ICU units; while these aren’t always the healthiest options, it does plenty for morale.

Listen to facts from experts, not headlines in the news. Knowledge is power more than ever, and I hope it quells some stress for some of you, I know it does for me. Stay safe and diligent everyone. This too shall pass.

Kero’s release and videos

Well, we’ll need something while we’re home. So for those of you who can, go get that record, which comes adorned with fantastic urban topography from Berlin’s graphic design shop www.neubauberlin.com, pressed in Detroit at Archer.

And for everybody, we get some eye candy. Dim the lights. Start with the opener:

Another one by Katya Ganya, for “Fisher”:

And Bryant CPU Place [www.cyberpatrolunit.com], for “Southfield”:

The post Kero and Defasten made our virus dreams into a futuristic music video (CDM premieres) appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/kero-and-defasten-virus-dreams/
via IFTTT


Control free streaming tool OBS Studio with OSC – and more essential tricks

Control live streaming and recording tool OBS Studio with other apps and tools, and route video live. Free add-ons make it all possible.

Keep in mind this isn’t just for the live streaming craze – it’s for recording, too. But if you’re going to stream, by all means, do something interesting.

Carlo Cattano has made a free tool with some major implications – and it’s simple enough that it’s also a nice demo of how to write this in Python, generally. This code lets you route Open Sound Control – the high-res, open communication protocol used by many VJ apps, touch apps on iOS, and other applications – into OBS Studio:

Control OBS Studio with Open Sound Control template example [https://github.com/CarloCattano/ObSC]
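Under the hood, OSC is a compact binary format – part of why bridges like this stay lightweight. Here’s a rough, stdlib-only sketch of how an OSC message gets packed on the wire (this is not taken from Carlo’s repo, and the /obs/scene address is purely hypothetical):

```python
import struct

def _osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8 bytes, NUL-terminated, padded to a multiple of 4."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build a minimal OSC message supporting int32 ('i') and float32 ('f') arguments."""
    type_tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            type_tags += "f"
            payload += struct.pack(">f", a)  # big-endian float32
        elif isinstance(a, int):
            type_tags += "i"
            payload += struct.pack(">i", a)  # big-endian int32
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _osc_string(address) + _osc_string(type_tags) + payload

# Hypothetical address an OSC-to-OBS bridge might listen on:
packet = osc_message("/obs/scene", 2)
```

A real bridge like ObSC would receive packets like this over UDP and translate them into obs-websocket requests; the point here is just that every field lands on a 4-byte boundary, which keeps parsing trivial.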

That opens up all sorts of possibilities – script and automate video switching, jam live with the input, automate screencasts and recording, and more. In action: well, blink-y viral action:

Also useful in OBS – you can route input from other applications directly.

On the Mac, you can use Syphon, open tech that lets you route 3D textures in OpenGL between apps as easily as you might an audio signal in a patch bay. Just plan to stick to versions earlier than Mojave; Mojave and Catalina break support – see this article. (But you can substitute NDI – see below.)

Using this sort of routing (Syphon, NDI, etc.), you might even go the opposite direction – using OBS as an output for projection mapping, for example:

On Windows, there’s Spout2 support (the Windows DirectX 11 equivalent of Syphon):

github.com/Off-World-Live/obs-spout2-source-plugin

For an example of what this is for, here’s someone recording live visuals – alongside Ableton Live – using OBS and Spout. And this is from 2017, so again, it’s not just about live streaming during the pandemic.

And across platforms, you can use obs-ndi, which supports NewTek’s NDI for networked audiovisual routing:

github.com/Palakis/obs-ndi

That’s useful, because it lets you freely specify sources, outputs, and filters using OBS over a network.

Streamers – and gamers in particular – have already been using this to turn phones into remote cameras and to stream across multiple computers.

You can even use it in place of a capture card:

More tips:

And yes, you could also use NDI to build your own switcher using something like TouchDesigner:

Full tutorial:

BUILD A NDI SWITCHER IN TOUCHDESIGNER 099 [mxav.net]

So there you have it. Let other people keep running horrible sound from their phones, while you use OBS as an all-purpose tool for routing, switching, capturing, and streaming video. Oh yeah, and – you can use all of this to make your phone a capture device, while your computer makes light work of streaming/recording audio feeds and mic in high quality.

And the essential glue here is all free.

That means this streaming craze is a perfectly reasonable occasion for the rest of us to hone some of our video chops, whether we’re musicians or visualists. So I hope you’re staying safe at home, and happily patching video switchers any time the news makes you a bit too anxious. At least … that’s part of my plan, for sure. Best to all of you and – yes, you can actually invite me to your streams.

The post Control free streaming tool OBS Studio with OSC – and more essential tricks appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/control-free-streaming-tool-obs-studio-with-osc-and-more-essential-tricks/


Uganda’s Afrorack goes from modular synths to a DIY disinfectant; more efforts worldwide

Brian Bamanya made a name making DIY modular synths, but now he’s applying voltage to another task – making sodium hypochlorite (aka bleach). Science! That joins a growing number of efforts of DIYers turning to fight the pandemic head-on.

Please, do not try anything like this before reading advisories below.

First off, this stuff is what’s known as household bleach or liquid bleach. Despite the fact that it’s sold readily, it is potentially very toxic – don’t let it touch other cleaning substances based on ammonia or acidic cleaners, for instance, or you’ll brew some harmful fumes. In fact, don’t even leave it in sunlight. (Here’s a list of don’ts.) Don’t drink it, obviously (okay, not obvious to some), but also don’t let it touch anything that you’re going to consume – don’t get this anywhere near food.

But used with care, bleach is fantastic. You’ll see it in the toolkits of professional cleaners for a reason – it’s good at certain tasks. And it is very effective on surfaces against SARS-CoV-2, that virus known as the coron— yeah, I know, you hear about it every 15 seconds. Let’s get back to bleach and chemistry, because they’re cool.

But the important thing here is – yes, this can produce a WHO-approved surface cleaner. And no, you should not take any advice in chemistry or health from CDM. Honestly, I’m not sure I would claim you should take synth advice from CDM. Here are reliable sources on bleach and SARS-CoV-2:

World Health Organization on disinfecting [WHO PDF]
COVID-19 – Disinfecting with Bleach [Michigan State University]
National Center for Biotechnology Information on bleach specifically [they’re part of the National Institutes of Health, a US government branch]
Environmental Protection Agency document on the topic

Brian’s approach leans as much on an electronics background as it does on chemistry, because you can make sodium hypochlorite by running electricity through a sodium chloride (salt) solution. Yeah – it’s analog. And that’s how it is manufactured industrially, too.

What Brian is doing that’s clever is making this on a small scale when industrially-produced material has been subject to price hikes – and reusing plastic bottle trash in the process.
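For the curious, Faraday’s law gives a rough ceiling on what a small cell can produce: the net brine-electrolysis reaction transfers two electrons per hypochlorite ion. This back-of-envelope sketch assumes 100% current efficiency, which real cells never reach – so treat the numbers as illustration only, not a recipe (and heed the safety warnings above):

```python
# Ideal NaOCl yield from brine electrolysis, via Faraday's law.
# Net reaction: Cl- + H2O -> OCl- + H2  (2 electrons per hypochlorite ion)
FARADAY = 96485.0   # C/mol, charge carried by one mole of electrons
M_NAOCL = 74.44     # g/mol, molar mass of NaOCl

def naocl_grams(amps: float, hours: float, efficiency: float = 1.0) -> float:
    """Theoretical NaOCl yield for a given current, time, and current efficiency."""
    charge = amps * hours * 3600.0                  # total charge in coulombs
    moles = efficiency * charge / (2 * FARADAY)     # 2 e- per NaOCl
    return moles * M_NAOCL

# Example: 1 A for 1 hour, ideal case -> on the order of 1.4 g of NaOCl
grams = naocl_grams(1.0, 1.0)
```

Even in the ideal case the output is small – which is exactly why this works at household-bottle scale rather than as an industrial replacement.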

Is this a good idea? I don’t want to comment, as I am neither an expert on infectious disease nor anything like a chemist. So I want to put it out there to hear reaction, as normally given the range of backgrounds on the site, someone has an answer. I’ll update this story and our social channels with whatever we hear.

You can support the project here:

gogetfunding.com/diy-sanitizer/

And find Brian here:

Bleach is effective in small concentrations; alcohol requires greater purity. But theoretically it should be possible to DIY ethanol, and off-the-grid types were doing that before the COVID-19 outbreak. Also, unlike distilling alcohol, making bleach will be legal in most places – though be careful not to sell it or make health claims, as that requires a license.

Let me again restate that I am not in any way qualified to talk about this, and you should not listen to me, though you should get in touch if you are qualified, and it is worth reading the experts – if for no other reason than to pass the time.

More efforts from the music makers

It’s also an indication of the changed world we’re in that the synth DIY community in general is in some cases turning to things other than musical instruments.

From Slovakia, Jonáš Gruska of the LOM label – an experimental music label and maker of various sound electronics – is one of many people making 3D-printed face masks. (He’s also experimenting with UV hardware, but the face masks, I know, are being actively advocated by health care professionals around the world, given their supply shortages.)

Groups like NYCResistor, who had been a partner of ours back in NYC, are engaged in similar projects – though the calls are as diverse as places looking for plexiglass boxes for intubation equipment.

Our friend Geert Bevin, now of Moog, has been making protective gear with UNC Asheville students working at the STEAM Studio:

UNCA students help make protective gear for health care workers [WLOS news]

People are sewing cloth masks, too – originally specifically excluded from guidance, but now part of international recommendations as the contagion and our knowledge of it evolve. Take for instance SewnMasksNYC, and (too many to list here) various efforts undertaken by musicians and media artists in our circle.

Places to find DIY help

I’ll refer to the official US Centers for Disease Control and Prevention (CDC) instructions here (English + Spanish), just posted as the agency updated its guidance to begin advocating cloth face coverings. After some mixed messages here, this document is clear and concise and applicable everywhere – uh, once you convert from inches. (Some day, my native country will go metric.)

Use of Cloth Face Coverings to Help Slow the Spread of COVID-19 [CDC]

You’ll also find active open source groups for equipment. The main hub is currently on Facebook:

www.facebook.com/groups/opensourcecovid19medicalsupplies/

With a preferred 3D-printed face shield plan living at:

www.prusaprinters.org/prints/25857-prusa-protective-face-shield-rc3

And here’s some music to accompany this article, by Ana Quiroga as NWRMNTC, who I understand has been sewing masks together with curator/artist Estela Oliva in the UK:

We needed some music, for sure, somewhere in this.

Let us know your feedback and what you may be involved in. I certainly don’t mean to imply that everyone in our community needs to contribute in this way – staying at home or doing your day job may be your best bet, and there’s plenty that matters in music itself these days. But I do hope we can use our networks to stay informed and connected.

The post Uganda’s Afrorack goes from modular synths to a DIY disinfectant; more efforts worldwide appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/uganda-afrorack-diy-disinfectant-and-efforts-worldwide/


Holodeck DJ: I played techno on an XR stage – here’s what it was like

There are cameras. There’s video and 3D. What happens when you create a futuristic mixed reality space that combines them, live? I headed to a cavernous northern New Jersey warehouse to find out.

With or without the pandemic crisis, our lives in the digital age straddle physical and imagined, meatspace and electronic worlds. XR represents a collection of current techniques to mediate between these. Cross or mixed is a way to play in the worlds between what’s on screen or video and what exists in physical space.

Now, with all these webcasts and video conferencing that have become the norm, the reality of mixing these media is thrown into relief in the mainstream public imagination. There’s the physical – you’re still a person in a room. Then there’s the virtual – maybe your appearance, and the appearance of your physical room, is actually not the thing you want to express. And between lies a gap – even with a camera, the viewpoint is its own virtual version of your space, different than the way we see when we’re in the same space with another person. XR the buzzword can melt away, and you begin to see it as a toolkit for exploring alternatives to the simple, single optical camera point of view.

To experience first-hand what this might mean for playing music, I decided to get myself physically to Secaucus (earlier in March, when such things were not yet entirely inadvisable). Secaucus itself lies in a liminal space of New Jersey that exists between the distant realities of the Newark International Airport, the New Jersey Turnpike, and Manhattan.

Tucked behind a small entrance to a nondescript, low-slung beige building, WorldStage hides one of the biggest event resources on the eastern seaboard. Their facility holds an expert team of AV engineers backed by a gargantuan treasure trove of lighting, video, and theatrical gear. Edgewater-based artist/engineer Ted Pallas and his creative agency Savages have partnered with their uniquely advanced setup to realize new XR possibilities.

“Digital artists collaborating with this new technology pave the road for where xR can go,” says Shelly Sabel, WorldStage’s Director of Design. “Giving content creators like Savages opportunities to play on the xR stage helps us understand the potential and continue in this new direction.”

I was the guinea pig in experimenting with how this might work with a live artist. The mission: get out of a Lyft from the airport, minimizing social contact, unpack my backpack of live gear (VCV Rack and a mic and controller), and try jamming on an XR stage – no rehearsal, no excuses. It really did feel like stepping onto a Holodeck program and playing some techno.

And I do mean stage. The first thing I found was a decent-sized surface, LEDs on the floor, a grid of moving head lights above, and over-sized fine-grade LED tiles as a backdrop on two sides. Count this as a seven-figure array of gear powering a high-end event stage.

The virtual magic is all about transforming that conventional stage with software. It’s nothing if not the latest digital expression of Neo-Baroque aesthetics and illusion – trompe-l’œil projection in real space, blended with a second layer of deception as that real-world LED wall imagery is extended in virtual space on the computer for a seamless, immersive picture.

It’s a very different feeling than being on a green screen or doing chroma key. You look behind you and you see the arches of the architecture Ted and his team have cooked up; the illusion is already real onstage. And that reality pulls the result out of the uncanny valley back into something your brain can process. It’s light years away from the weather reporter / 80s music video cheesiness of keying.

I’m a big believer in hacking together trial runs and proofs of concept – and fortunately, so are Ted and team, as I was the first to try out this XR setup in this way. He tells CDM:

This was our first time having an artist in one of our xR environments, in a specific performance context – we’d previously had some come visit, but Peter is the first to bring his process into the picture. As such, we decided to keep things mellow – there was a lot of integration getting blessed as “stable” for the first time, and I wanted to minimize the potential for crashing during the performance – my strong preference is to do performances in one take.

The effects you’ll see in the video are pretty simple and subtle by design. Plus I was entirely improvising – I had no idea what I would walk onto in advance, really. But the experience already had my head reeling with possibilities. From here, you can certainly add additional layers of augmentation – mapping motion graphics to the space in three dimensions, for instance – but we kept to the background for this first experiment.

Just as in any layered illusion, there’s some substantial coordination work to be done. The Savages team are roping together a number of tools – tools which are not necessarily engineered to run together in this way.

The basic ingredients:

  • Stype – camera tracking
  • disguise gx 2c – media server (optimized for Notch)
  • Notch – real-time content hosted natively in disguise media software
  • Unreal Engine – running on a second machine feeding disguise
  • BOXX hardware for Unreal, running RTX 6000 GPUs from NVIDIA
  • SideFX Houdini software for visual effects

The view from Notch.

Camera tracking is essential – in order to extend the optically-captured imagery with virtual imagery as if it were in-camera, it’s necessary for each tiny camera move to be tracked in real time. You can see the precision partly in things like camera vibrations – the tiniest quiver has a corresponding move in the virtual video. Your first reaction may actually be that it’s unimpressive, but that’s the point – your eye accepts what it sees as real, even when it isn’t.
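To get a feel for why that precision matters, here’s a toy pinhole-camera sketch in Python (the focal length and pixel numbers are invented for illustration, not WorldStage’s calibration): even a one-milliradian quiver of the physical camera shifts a point five meters away by roughly a full pixel, which is why the tracked pose has to drive the virtual camera on every frame.

```python
from math import sin, cos

def project_x(point, yaw, f=1000.0, cx=960.0):
    """Horizontal pixel coordinate of a world point, as seen by a pinhole
    camera at the origin rotated by `yaw` radians about the vertical axis."""
    x, _, z = point
    # Express the world point in the rotated camera's coordinate frame
    x_cam = x * cos(yaw) - z * sin(yaw)
    z_cam = x * sin(yaw) + z * cos(yaw)
    return f * (x_cam / z_cam) + cx

# A point on the virtual set extension, 5 m straight ahead of the camera
straight = project_x((0.0, 0.0, 5.0), yaw=0.0)
# The same point after a 1-milliradian camera quiver (about 0.057 degrees)
quiver = project_x((0.0, 0.0, 5.0), yaw=0.001)
shift = quiver - straight  # roughly one full pixel of on-screen movement
```

If the virtual camera didn’t copy that quiver, the background would visibly swim against the LED wall – exactly the seam the tracking erases.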

Media servers are normally tasked with just spitting out video. Here, disguise is processing data and output mapping at the same time as it is crunching video signal – hiding the seams between Stype camera tracking data and video – and then passing that control data on to Notch and Unreal Engine so they’re calibrated, too. It erases the gap between the physical, optical camera and the simulated computer one.
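To make that hand-off concrete, here’s a minimal Python sketch of the idea (the field names and packet layout are my own illustration – this isn’t disguise’s actual data format): one decoded camera pose fans out unchanged to every render engine, so the physical and simulated cameras can’t drift apart.

```python
from dataclasses import dataclass, asdict

@dataclass
class CameraPose:
    """One frame of tracking data: position in meters, rotation in degrees,
    plus lens zoom - the minimum a virtual camera needs to match the real one."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float
    zoom: float

def fan_out(raw_packet, consumers):
    """Decode one raw tracking tuple and hand the *same* calibrated pose
    to every downstream engine (think Notch and Unreal)."""
    pose = asdict(CameraPose(*raw_packet))
    for deliver in consumers:
        deliver(pose)
    return pose

# Hypothetical stand-ins for the Notch and Unreal calibration inputs
notch_frames, unreal_frames = [], []
fan_out((1.2, 1.8, -3.0, 12.5, -4.0, 0.1, 24.0),
        [notch_frames.append, unreal_frames.append])
```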

Those of you who do follow this kind of setup – Ted notes that disguise is instancing Notch directly on its timeline, while Unreal is being hosted on that outboard BOXX server. And the point, he says, is flexibility – because this is virtual, generative architecture. He explains:

All about the parameters.

Apart from the screen surface in the first set, all geometry was instanced and specified inside of the Unreal Engine via studio-built Houdini Digital Assets. HDAs allow Houdini to express itself in other pieces of software via the Houdini Engine – instead of importing finished geometry, we import the concept of finished geometry and specify it within the project, usually looking through the point of view of the [virtual 3d] camera.

This is similar in concept to a composer writing a very specific score for an unknown synthesizer, and then working out a patch with a performer specific to a performance. It’s a very powerful way to think about geometry from the perspective of the studio. Instead of worrying about finishing during the most expensive part of our process time-wise — the part that uses Houdini — we buffer off worrying about finishing until we are considering a render. This is our approach to building out our digital backlot.

The “concept of the geometry” – think a model for what that geometry will be, parameterized. There’s that Holodeck aspect again – you’re free to play around with what appears in virtual space.
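A plain-Python analogy for that idea (deliberately not the Houdini API – real HDAs are authored in Houdini and evaluated via the Houdini Engine): instead of exporting baked vertices, you ship a generator and commit to its parameters later, per project or even per shot.

```python
def stair_profile(steps, rise=0.17, run=0.28):
    """The 'concept' of a staircase: a 2D outline generated from parameters,
    rather than baked geometry. Change `steps` and a different set piece
    falls out of the same asset."""
    pts = [(0.0, 0.0)]
    for _ in range(steps):
        x, y = pts[-1]
        pts.append((x, y + rise))        # vertical riser
        pts.append((x + run, y + rise))  # horizontal tread
    return pts

small = stair_profile(3)   # modest set piece: 7 outline points
grand = stair_profile(12)  # same asset, grander parameters: 25 points
```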

Set pieces in Houdini.

There are two set pieces here as a demo. I actually quite liked even the simple first set, to which they mapped a Minimoog picture on the fly – partly because it really looks like I’m on some giant synth conference stage in a world that doesn’t yet exist. Ted describes the set:

The first set is purposefully pedestrian – in as little time as possible, we took a screen layout drawing for an existing show, added a bit of brand-relevant scenic, and chucked it in a Notch block. The name of the game here was speed – start to finish production time was about three hours. On the one hand, it looks it. On the other hand, this is the cheapest possible path to authoring content for xR – treat it like you’re making a stage, and then map it from the media server like it’s a screen. What’s on the screen can even be someone else’s problem, allowing digital media people to masquerade as scenic and lighting designers.

The second piece is more ambitious – and it lets a crew transport an artist to a genuinely new location:

Inside the layers of Savages’ virtual architecture.

The second set design was inspired by architect Ricardo Bofill’s project La Muralla Roja. As the world was gearing up to shut down, we spent a lot of time discussing community. La Muralla Roja was built to challenge modern perspectives of public and private spaces. Our Muralla is intended to do the same. We see it as a set for multiple performers, each with their own “staged location,” or as a tool to support a single performer.

Courtesy Ricardo Bofill, architects – see the full project page (and prepare to get lost in photos transporting you to the North African Mediterranean for a while).

And yes, placing an artist (that’ll be me, bear with me here) – that adds an additional layer to the process. Ted says:

[Bofill’s] language for the site is built out of plaster and the profile of a set of stairs, modulated by perpendicularity and level. An artist standing on [our] LED cube is modulating a perpendicular set of surfaces by adding levels of depth to the composition.

This struck me as a good peg for us all to hang our hats on. Without you [Peter] standing there, the screens are very flat – no matter how much depth is in the image. Likewise, without the stairs, Muralla Roja would be very flat. When I was looking for references, this is what struck me.

It may not be apparent, but there is a lot still to be explored here. Because the graphics are generative and real-time, we could develop entire AV shows that make the visuals as performative as the sound, or even directly link the two. We could use that to produce a virtual performance (ideal for quarantine times), but also extend what’s possible in a live performance. We could blur the boundary between a game and a stage performance.

It’s basically a special effect as a performance. And that opens up new possibilities for the performer. Here I was pretty occupied just playing live, but having dipped into these waters for the first time, of course I’m eager to re-imagine the performance for this context – since the set I played here was really just conceived as something that fits into a (real-world) DJ booth or stage area.

Ted and Savages continue to develop new techniques for combining software, including getting live MIDI control into the environment. So we’ll have more to look at soon.
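As a sketch of what feeding live MIDI into such an environment involves (the ranges and targets here are hypothetical – Savages haven’t published their actual rig), the core step is normalizing a 7-bit controller value onto whatever continuous parameter the visual engine exposes:

```python
def cc_to_param(value, lo=0.0, hi=1.0):
    """Map a 7-bit MIDI CC value (0-127) onto a continuous visual
    parameter range - say, an exposed Notch property or an OSC float."""
    if not 0 <= value <= 127:
        raise ValueError("MIDI CC values are 7-bit: 0-127")
    return lo + (value / 127.0) * (hi - lo)

# In a live rig this would sit in a loop over incoming messages -
# e.g. with mido:  for msg in mido.open_input(): ...  (not run here) -
# forwarding each mapped value on to the media server.
depth = cc_to_param(64)             # mod wheel near center -> ~0.504
hue = cc_to_param(127, 0.0, 360.0)  # full twist -> 360.0 degrees
```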

To me, the pandemic experience is humbling partly in that it reminds us that many audiences can’t physically attend performances. It also reveals how virtual a lot of our connections were even before they were forced to be that way – and reveals some of the weakness of our technologies for communicating with each other in that virtual space. So to sound one hopeful note, I think that doubling down on figuring out how XR technologies work is a way for us to be more aware of our presence and how to make the most of it. Our distance now is necessary to save lives; figuring out how to bridge that distance is an extreme but essential way to develop skills we may need in the future.

Full set:

Artist: Peter Kirn
Designer (Scenography, Lighting, VFX): Ted Pallas, Savages
Director of Photography: Art Jones
Creative Director: Alex Hartman, Savages
Technical Director: Michael Kohler, WorldStage

www.savag.es

www.worldstage.com

Footnote: If you’re interested in exploring XR, there’s an open call out now for the GAMMA_LAB XR laboratory my friends and partners are running in St. Petersburg, Russia. Fittingly, they have adapted the format to allow virtual presence, letting the event itself go on, and it will bring together some leading figures in this field. It’s another way worlds are coming together – including Russia and the international scene.

Gamma_LAB XR [Facebook event / open call information in Russian and English]

The post Holodeck DJ: I played techno on an XR stage – here’s what it was like appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/holodeck-dj-xr-stage-techno/
via IFTTT


Begone, webcams: Dixon will premiere an album in gorgeous 3D mixed reality, today

We’ve gazed into grainy video feeds and literally watched multi-camera shoots of empty clubs. But we’re also starting to see a move into futuristic 3D and mixed reality – starting with one unique album premiere from the Innervisions mastermind.

The feed is tonight Berlin time, that’s 10PM CET so 4PM NYC time, 1PM California. (And yeah, my heart is with you right now, America, even as I type the letters NYC – and many other places worldwide. I know this is beyond tough; I’m watching and listening. If you want to join for Dixon distraction in 3D, please say hi.)

In a global crisis, one key element to look for in culture is people who were working on something before all of this, and that might endure through and after. So Dixon qualifies. He already DJed virtually (thanks to motion capture, a collaboration with Rockstar Games, and an appearance in Grand Theft Auto V Online, which you can still go visit in the game). And he had unveiled his Transmoderna project, which at least had the lofty goal of turning a club into something immersive. It’s hard to untangle what that means from the description, and I don’t tend to hang out in Ibiza. In PR materials, everything in Ibiza seems to turn into some high-concept club-marketing gobbledygook – but yeah, his residency hosting everyone from the Innervisions crew to Mano Le Tough to Honey Dijon also had the notion of developing immersive technology and reimagining what a club was.

Let’s skip to what is actually happening now, tonight Berlin time, as it at least starts to plumb this question of “what could be on a video feed that isn’t just a camera pointed at a bunch of clubgoers?”

The clubgoers for something like Boiler Room are now legally removed, their absence mandated by German contact limits in this pandemic. But the supercomputer on your desk and the supercomputer in your pocket were already capable of doing other things.

So Dixon tonight will perform a mixed reality DJ set as part of the Transmoderna collaboration, and debut unreleased music of the same name. The artist says he’s already building his DJ set from this material. (No word yet on who’s involved, but previously announced collaborators under this moniker included Âme, Mathew Jonson, Echonomist, Trikk, Frank Wiedemann, and Roman Flügel.)

It’s really the visual material that starts to show promise, though – see the images here. Bleeding-edge visualist studio SELAM X has created an alien fishtank for the artist to inhabit tonight – and you can see the wonderful creatures they’re producing.

The whole thing will run in a game engine – fitting, as platforms like Twitch were streaming games while the club community was still just showing, well, clubs. And beyond that, I don’t know what to expect. But I’ll be tuning in, as this feels less like “DJ mix plus webcam” and more like something worth seeing on a screen.

I hope to talk to one of the artists at SELAM X soon, so take a look, let us know what you think, and if you have questions.

But I do suspect there’s a lot of potential here. And hey, if you want to catch Dixon and The Black Madonna in GTA V, too, I’m game. It’s more fun than watching Facebook Live chat hang my browser tabs, I know that. (Hey – I believe in computers and the Internet. We will get there, because we can.)

www.instagram.com/selamxstudio/

The post Begone, webcams: Dixon will premiere an album in gorgeous 3D mixed reality, today appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/dixon-live-3d/


Streamlabs is an easier, free all-in-one streaming app, now on Mac, Windows, iOS, and Android

Start with OBS, the now industry-standard streaming app, and add a bunch of special sauce to make it easier and friendlier. Now you’ve got Streamlabs – and it just added Mac support to its other platforms.

Mention live streaming any time in the past year or so, and someone no doubt told you to use OBS. Open Broadcaster Software, aka OBS Studio, is indeed free and powerful – not only for streaming but live recording, too. (It quietly displaced a lot of pricey and often incomplete commercial screencasting software, too.)

OBS has gotten a lot easier – a cash infusion from Twitch, Facebook, NVIDIA, and Logitech no doubt helped. But it’s still a bit intimidating as far as configuring settings for recording, to say nothing of the manual settings required to then make it upload to various streaming platforms.

That’s where Streamlabs comes in. It’s got its own desktop apps based on OBS, plus apps that let you easily stream from Android and iOS, too. So while you could do all of this on OBS desktop, Streamlabs makes it easier – basically, it’s a bit like having a custom distro of OBS. And then by adding mobile access, those platforms become easier, too.

Looks like OBS – but 100% less intimidating.

So in addition to all the things that make OBS powerful – using any video source or onscreen inputs, switching between them, handling resolutions and recording as well as connecting, you get:

  • Pre-configured streaming platforms and easy login (think YouTube, Twitch, Facebook, etc.)
  • Auto-optimized video settings
  • Custom alerts (so you can also beg for donations, add engagement)
  • Themes and widgets for customizing your stream
  • Built-in chat (normally requiring you to open another window in OBS, which gets surprisingly clumsy fast)
  • Easy recording
  • Cloud backups (so you don’t lose your recording)

streamlabs.com

Honestly, having played around with it a bit, maybe the best part of Streamlabs is that all the power of OBS is there, but easier to use. So it doesn’t feel like a dumbed-down version of OBS so much as a polished, beginner-friendly interface with all the same features – and some useful additions.

The easier-to-follow Sources dialog alone is probably worth the price of admission. And price of admission is free, anyway.

The mobile apps also feature a lot of nice integrations on these lines, too. Think similar cross-platform streaming support, importing OBS settings from desktop, and adding widgets for events, donations, and chat.

streamlabs.com/mobile-app

This spin on OBS is open source, like its sibling. It’s based on Electron, so I hope that now that macOS has been added, we’ll see Linux, too. Linux users should meanwhile note that OBS packaging has improved a lot across distros – Ubuntu Studio, for instance, even bakes a pre-configured OBS right into the OS. I have no idea how much work would be required to do the same with Streamlabs. (PS, you can beta test Ubuntu 20.04 LTS right now and help them squash bugs before what I think will be a very essential global pandemic stay-at-home OS release!)

So, since this is free and open source, what’s the business model?

Basically, you can grab this for free and have a nicer version of OBS. Tips and donations to content makers go 100% to you – no cut for Streamlabs. (Good – and a major difference with a lot of horrible startups.)

Then for a monthly fee, you can add additional effects (US$4.99/month, “PRO”), or a bunch of custom widgets, custom domain and website, and other extras (Prime, $12/mo billed annually).

streamlabs.com/pricing

I hope they allow month-to-month billing, but regardless, it’s nice to see a business built on open source software that still has a sustainable support model. (CDM is possible because of just that idea – thank WordPress.)

I’m sure some people are groaning at me even sharing this information, given how many streams are out there right now. But “streaming” doesn’t necessarily mean to a wide audience – it’s useful in any case where you want to teleport yourself around the world (while under stay-at-home orders, for instance), even if it’s to a small group. Plus, even if you haven’t been struggling with this yourself, now you can tip off your friends so they don’t a) bug you about how to set up their stream and/or b) stream really low-quality material you then have to watch.

And I think just as with blogs, the question is not really quantity or openness, but quality – and whether there’s a model for supporting the people putting out that quality. More on this soon.

The post Streamlabs is an easier, free all-in-one streaming app, now on Mac, Windows, iOS, and Android appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/streamlabs-free-streaming-mac-windows-ios-and-android/
via IFTTT



SoundCloud just added a direct support link; here’s how to turn it on for your profile

SoundCloud is talking about a host of new support initiatives, but there’s one you might want to switch on right now – a direct support button for your profile.

It goes without saying that people across the arts, entertainment, and many other industries are hitting hard times, and not everyone has a strong support net. It also goes without saying that music makers often struggle under the best of circumstances. SoundCloud is the latest in our arena to announce various initiatives – everything from working with Twitch to discounting SoundCloud Pro accounts to a set of news today. That includes $5 million in promotional support and a $10 million accelerator program (Repost Select).

But let’s get to that later, because the simplest thing SoundCloud is doing is to make a prominent link available on your profile page.

The direct fan-support button is just a big blue box with a link of your choosing. There’s some pre-defined text reminding people that artists are impacted by COVID-19 and health interventions.

Here’s how it appears in action (currently live on the Web; mobile I’m told is coming):

Heck yeah, support Erika. CDM endorsement. Among others.

To add yours, go to your Creators profile page while logged in, and choose the “edit” button on the right. Then click “add support link,” and a new text field appears. Here you can add a link to a number of sites. (Custom links don’t work, but they are taking feedback on services you want if one isn’t listed.)

Add the link here. Note I wanted Bandcamp listed separately, too, so I added it twice.

Supported sites you can link to (for now):

  • PayPal.me
  • Cash App
  • Venmo
  • Bandcamp
  • Shopify
  • Patreon
  • Kickstarter
  • GoFundMe

Emphasis mine – it isn’t just about begging for money or passing the hat, because you can also direct people to buy your music and merch on Bandcamp, head to a store on Shopify (where you might even sell synthesizer hardware, for instance), subscribe on Patreon, or support a Kickstarter project.

It’s just a link – there’s no commission that goes to SoundCloud. Basically, it works like the existing links on your profile, but limited to a white-list of sites on which you can get paid. (Oh yeah – you can even link Bandcamp twice, which is what I do; I want people’s support to come in the form of downloading my music, and I still want to list my Bandcamp page alongside Twitter and Facebook.)

Find the instructions here:

Direct Support Link [SoundCloud help]

Blog post: Add this new button to your profile so fans can financially support you [SoundCloud blog]

And an update from CEO Kerry Trainor on everything else

My experience is this: a lot of music lovers want to support artists. As we saw with Bandcamp’s music sale last week, and as we’ve seen on Patreon and other services, people are looking for opportunities. That makes it even more tragic that big tech providers like Spotify, YouTube, and Apple tend to focus more on building their platforms than on enabling direct connections to artists. I hope this move from SoundCloud is a first step toward more of that kind of direct support.

Stay tuned; we’ll soon talk to SoundCloud about their strategy with creators here, and hopefully unravel some of the other offerings.

But meanwhile, if you’re wondering if you should turn that button on, I say yes!

I was going to add more, but really – just yes.

The post SoundCloud just added a direct support link; here’s how to turn it on for your profile appeared first on CDM Create Digital Music.

from CDM Create Digital Music cdm.link/2020/04/soundcloud-direct-support-button/