Any way to get an oscillator to draw the waveform to track in the sequencer lanes?
I would love to be able to take my favorite oscillator (Ammo) and have that bad boy draw its waveforms to track in the sequencer lanes, then use those patterns to control the different controls of my devices. Does that make sense? And is there any way to do this? I really can't think how it is possible with the current version (or any version) of Reason.
As far as I remember there is audio-to-CV in Thor's mod matrix. Here it is!
Budapest, Hungary
Reason 11 Suite
Lenovo ThinkPad e520 Win10x64 8GB RAM Intel i5-2520M 2,5-3,2 GHz and AMD 6630M with 1GB of memory.
I'm not sure what you mean, but what about:
Ammo to CV in on Thor,
inside Thor, CV in to audio output,
the audio output to an audio channel,
record,
and you should have those lines?
- Raveshaper
- Posts: 1089
- Joined: 16 Jan 2015
I would recommend attaching the signals from Ammo to a combinator and mapping them to devices similar to those you plan to control later. By Alt-clicking the controls that Ammo is manipulating and then recording automation for each newly created automation lane, you should be able to generate automation without converting to audio. But maybe Ammo works differently?
Enhanced by DataBridge v5
You mean, to transform them into automation clips? Because that's possible by sending the waveform out through an EMI and having that same MIDI CC loop back into Reason (method dependent on OS).
But maybe QwaizinG's method works too... never tried that.
V9 | i7 5930 | Motu 828 MK3 | Win 10
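One implication of the MIDI CC loopback idea above: whatever shape comes back through a CC has 7-bit resolution, so a smooth LFO becomes a stepped curve. A minimal Python sketch of that quantization (all names here are hypothetical illustration, not anything in Reason or the EMI):

```python
import math

def lfo_to_cc(shape_fn, cycles=1.0, steps_per_cycle=32):
    """Sample a bipolar LFO shape (-1..1) into 7-bit MIDI CC values (0..127),
    the resolution you'd get back through a CC loopback."""
    n = int(cycles * steps_per_cycle)
    cc = []
    for i in range(n):
        phase = (i / steps_per_cycle) % 1.0      # 0..1 position within the cycle
        v = shape_fn(phase)                      # bipolar CV-style value
        cc.append(round((v + 1.0) / 2.0 * 127))  # rescale to 0..127 and quantize
    return cc

sine = lambda p: math.sin(2 * math.pi * p)
points = lfo_to_cc(sine, cycles=1, steps_per_cycle=8)
print(points)  # 8 stepped values between 0 and 127
```

At 8 steps per cycle the staircase is obvious; raising `steps_per_cycle` trades smoother curves for denser automation data, which may be part of why the loopback capture gets heavy.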
This might be of use.
https://youtu.be/6y0Ro2M1Sh4
The toolbox doesn't appear to be available on PH website anymore, but I found a copy on the Internet Archive Website.
https://web.archive.org/web/20130616132 ... f/JB34.zip
Thanks guys. Yes, I am trying to make automation clips, which are easier to see and control than a straight LFO CV input. I am lousy at drawing curves, so I wanted to use the precision and perfection of Ammo. No time for a big reply, I have to run to work, but I will try some of your ideas later and report back here. I didn't think to use Thor's audio-to-CV. That might work.
Thanks again for your replies.
- Raveshaper
- Posts: 1089
- Joined: 16 Jan 2015
Getting smooth curves is something I'm trying to figure out myself. Perhaps someday I'll have a good way of doing this, but for now it's theoretical.
Enhanced by DataBridge v5
- jfrichards
- Posts: 1310
- Joined: 15 Jan 2015
- Location: Sunnyvale, CA
Here are the above mentioned James Bernard automation curves in R8:
- Attachments
-
- automation_curves.reason.zip
- (177.06 KiB) Downloaded 104 times
These James Bernard files are helpful, but I want more choices. That's why I wanted to do this with Ammo (128 different waveforms).
None of these other solutions are really working. If I convert CV to audio and record the audio, the clips cannot be dragged into the automation lanes in the sequencer. I don't know of any way to convert audio waves to automation lanes. So I don't think CV-to-audio is going to solve this issue.
The EMI method is crashing my system (too many notes are being recorded), and I don't see it even recording the automation, but I could be doing this wrong. I haven't used this method for recording AutoTheory for about 6-9 months now... I may have forgotten how to do it, exactly.
The combinator method doesn't seem to do anything, either. I mapped the instrument to the rotary knob, attached an LFO to the CV in of said knob, hit record... and NOTHING. No automation gets recorded. The LFO moves the knob, but I can't figure out how to get it to record the automation I am seeing.
I'm not sure if you guys completely understand what I am trying to do/asking. I blame my lack of articulation. I am essentially trying to turn an LFO waveform into an automation lane (kind of like notes to track, but for automation lanes). I want the LFO waveform to draw or record itself into the automation lane. Then I can drag and drop it, time-stretch it in places, cut it up, copy, paste, etc... total control. I'm not sure this is possible. Does anybody have any other ideas I should try?
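The clip editing wished for here (stretch, cut, paste of a recorded shape) can be sketched abstractly. A hypothetical Python model, nothing to do with Reason's file format, treating an automation clip as a list of (beat, value) points:

```python
def stretch(clip, factor):
    """Time-stretch a clip by scaling every point's beat position."""
    return [(t * factor, v) for t, v in clip]

def slice_clip(clip, start, end):
    """Cut out the points between two beat positions, re-zeroed to the start."""
    return [(t - start, v) for t, v in clip if start <= t < end]

ramp = [(i * 0.25, i * 8) for i in range(16)]   # a 4-beat rising ramp
half_speed = stretch(ramp, 2.0)                 # same shape over 8 beats
bar_two = slice_clip(half_speed, 2.0, 4.0)      # grab beats 2..4 as a new clip
```

The point of the sketch: once the shape exists as editable points rather than audio, these operations are trivial, which is exactly what recording CV as audio doesn't give you.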
Simply put - LFO to automation lane is not available in Reason.
But, there are so many things you can do with CV routing or automation and/or that JB toolbox, I don't see the issue here. For example:
challism wrote: Then I can drag and drop it, time-stretch it in places, cut it up, copy, paste, etc...
So, you essentially want to alter them anyway, right? Why do they have to be the exact shape then? Does it really matter to you that the shape is made from a specific source, when you're going to play around with it like that? Maybe I'm missing the point here, but between recording CV-modulated stuff to audio and slicing it up, using pre-made automation curves, and drawing random stuff, freehand or line, do you miss that feature that much? Throw Synchronous into that and you should go *insane* with all the possibilities. IMO as usual!
- Raveshaper
- Posts: 1089
- Joined: 16 Jan 2015
The second piece of the puzzle when converting CV to audio is that you have to convert the direct output of the audio track back to CV, then apply the signal to the destination device(s).
Unfortunately, recording automation requires using an EMI with the signal being returned by midi loopback through a physical midi port. This introduces latency into the capture process and can cause midi feedback loops if the device sending the signal to the EMI is selected during recording. The only way to kill a midi feedback loop is to restart Reason.
Enhanced by DataBridge v5
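If the loopback latency mentioned above is roughly constant, it is at least correctable after capture by shifting the recorded events back in time. A hedged sketch (hypothetical names, not a Reason feature):

```python
def capture_with_latency(events, latency):
    """Simulate the loopback: every (beat, value) event arrives late by a fixed amount."""
    return [(t + latency, v) for t, v in events]

def compensate(events, latency):
    """Shift captured events back by the measured latency, clamped to the clip start."""
    return [(max(0.0, t - latency), v) for t, v in events]

sent = [(0.0, 64), (0.5, 96), (1.0, 127)]
captured = capture_with_latency(sent, 0.02)   # assumed constant 0.02-beat delay
fixed = compensate(captured, 0.02)            # back on the grid, within float precision
```

This obviously does nothing for the feedback-loop problem, only for the constant offset.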
- submonsterz
- Posts: 989
- Joined: 07 Feb 2015
QwaizanG wrote: The second piece of the puzzle when converting CV to audio is that you have to convert the direct output of the audio track back to CV, then apply the signal to the destination device(s). Unfortunately, recording automation requires using an EMI with the signal being returned by midi loopback through a physical midi port. This introduces latency into the capture process and can cause midi feedback loops if the device sending the signal to the EMI is selected during recording. The only way to kill a midi feedback loop is to restart Reason.
Say whaaat...
Why are you using an EMI?? This is easy with just Thor...
Any CV in to Thor's audio: any CV signal (LFO, whatever) gets converted to audio and sent to a mix channel via Thor's audio out, then the audio goes back into Thor's audio-to-CV. Simple...
No loopbacks there.
You can also stretch the shit out of any sample and use that as the modulator, or, as I found, single-cycle loops are nice shapes to play with: just shove a load of various ones on a sample lane and have that routed through Thor as CV to the destination. Simples...
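The single-cycle-loop idea can be illustrated outside Reason: synthesize a one-cycle waveform as a 16-bit WAV in memory, read it back, and rescale it to a unipolar control range, roughly what an audio-to-CV stage sees from a looped sample. A Python sketch using only the standard library (the sine shape and sizes are arbitrary choices for the example):

```python
import io
import math
import struct
import wave

# Write a hypothetical single-cycle loop (one sine period, 16-bit mono) to an
# in-memory WAV, the kind of file you'd drop on a sample lane.
N, buf = 64, io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(44100)
    frames = b"".join(struct.pack("<h", int(32767 * math.sin(2 * math.pi * i / N)))
                      for i in range(N))
    w.writeframes(frames)

# Read it back and rescale the bipolar audio samples to a 0..1 control shape.
buf.seek(0)
with wave.open(buf, "rb") as w:
    raw = w.readframes(w.getnframes())
samples = struct.unpack("<%dh" % (len(raw) // 2), raw)
cv = [(s / 32767 + 1.0) / 2.0 for s in samples]   # bipolar audio -> 0..1 CV
print(min(cv), max(cv))  # roughly 0.0 and 1.0
```

Looping that one cycle at different speeds is exactly the "stretch the shit out of it" trick: the same 64 points become a slow sweep or a fast wobble depending on playback rate.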
challism wrote: I would love to be able to take my favorite oscillator (Ammo) and have that bad boy draw its waveforms to track in the sequencer lanes, then use those patterns to control the different controls of my devices. Does that make sense? And is there any way to do this? I really can't think how it is possible with the current version (or any version) of Reason.
This can be done by creating a mix channel and putting it in Rec mode, which then allows you to treat that mix channel as an input to an audio track in the sequencer.
This records as audio, which can be converted to CV via Thor, though I've just realised you wanted to be able to edit it.
- submonsterz
- Posts: 989
- Joined: 07 Feb 2015
Akk, didn't read the thread before posting, lol. Sorry Q, lol. Didn't see he wants it as automation clips to draw and edit into... The only way to edit audio-drawn ones is to stretch, overlap, and mash the clips together. Definitely not the same as being able to hand-edit with the pencil tool or the like on a clip of automation.