The ultimate Asus RT-N16 setup: one router that lets you access your files or network while abroad, downloads your torrents automatically, serves your media, hosts your files, tracks all your bandwidth and gives you fast WiFi.
Herbert is a short new webseries I worked on as a labour of love to hone our craft. It changes slightly as the gear changes through the 8 episodes. I had a lot of fun making Herbert; I think episode 7 sounds pretty good.
It isn’t too Canadian though, right?
Pouring the maple on extra thick. We didn’t have a grant, a backer, a bravo… not even a slow clap. But we did it anyway. Is it because we were bored? Maybe. Were we hellbent on an “if we film it, they will come” fallacy, a lapse in judgement? Possibly. Maybe it was more that we were broke, so we had nothing better to do with our time. Judge for yourself if it is pure folly.
I had this issue with the upstream light blinking on my modem for a couple of months, and Teksavvy couldn’t solve the problem. It was a major factor in my deciding to shop around for other ISPs at the time, and really was serendipity in hindsight. My Thomson DCM-475 one day just started blinking. I hadn’t seen a modem blink like that before, but I chalked it up to an anomaly of being a Teksavvy customer (never do that), and I was too busy with audio to stop and figure it out. Hell, the internet was still working. One day, when I should have been doing something more fun, I was poking through the diagnostics of the modem to see what was amiss. I had seen a few error messages, so I decided to try to get an RMA going in case the modem was kaput. I described the signal strength shown in the GUI to Tek’s support staff. They said the signal levels were a bit low but should be fine, told me to reset the router as per the script they read to everyone, and that was that.
http://192.168.100.1/ is the address to enter into your browser when you need to check the diagnostics of your Thomson or Technicolor modem.
Only, that wasn’t simply it.
The US light kept flashing. I let it go for a while, and eventually it began bothering me again, so I called tech support and they said they would send out a technician. It was directly after an ice storm, so it would be a free service call.
I had to wait a week or so but finally, one snowy afternoon I saw a white unmarked van parked a few houses down. I knew the day of reckoning had come. The tech did a great job cleaning up the last tech’s mess. He fastened the exterior connection to the house, which was loose, re-terminated all the coaxial cables and brought in his own modem to confirm the issue was with my modem. Or so it seemed.
I called support again and told them the line was fine and my signal should be clear, but something was wrong with the modem. They told me it had gone out of warranty just a few weeks earlier. I was pretty miffed, naturally. I had gone through all the troubleshooting steps and multiple calls, and now I was left without a proper modem. I was going to get this damn upstream light to go solid, period, end of sentence. I wasn’t going to buy a new one, however.
I took to DSLReports.com, which is a great site for researching ISPs. Someone on the forum had (much earlier) pegged the error to an unbound connection. Previously I had been unable to identify the POST message, the sequence of flashing lights you match against a table in the manual to interpret; the US light wasn’t blinking like the manual described.
Unbound connection? Sounds like a provider issue to me; I was a bit irked. Then I found someone else who had mentioned power levels for binding, and it struck me: what if my PSU had failed and my voltages were drooping on the control channel? I know, you were probably thinking that all along…
I went and got another wall wart, ya know, the black thing you stick into your already jammed-up power bar. This is the kind you get lucky with when it’s offset so you can actually plug things in above and below it. Well, that’s a DC power supply: it steps the AC voltage from your walls down and converts it to DC. It was just a matter of matching the same voltage, amperage and polarity to get a new one for a few bucks at Active Surplus, and bam, the issue was solved. Polarity is marked with a little picture like this.
It saved me shelling out $100 for a modem that wasn’t broken, so hopefully this helps someone out there put their mind at ease too.
Back in my last post I spoke about backups and how you can use rsync to back things up more reliably and completely than the Finder. In the spirit of Backup Day I will elaborate on scheduling, since REGULAR backups are really the only backups that save us the suffering of lost data.
This time we will take a look at crontab, which stands for cron table. cron is the scheduler built into Unix-like systems: you add commands to its table and they are executed at the times you set. (One caveat: if your computer is asleep at the scheduled time, cron will generally just skip that run.) Using crontab is as simple as taking the commands you want to schedule and putting them into a plain text file with .command as the extension. I will use the rsync command here. Open Terminal and punch in:

cd ~/Documents

You will now be in the Documents folder of your home user.
Now punch in:

touch rsyncbackup.command

The touch command creates a blank file.
Then enter: vi rsyncbackup.command
This will bring you into a very archaic text editor that you will be forced to use on some versions of Linux; it’s best to just accept it and learn how to get around in it, as it will save you a hell of a lot of trouble. Hitting the Esc key and then entering a colon allows you to enter a slew of commands. Just stick to what I am using here and you will be fine 😉 Hit i to enter insert mode, type your rsync command into the file, then hit Esc and enter :wq to write and quit. Now to edit the cron table we need to enter:

crontab -e
Now to enter text we need to hit the i key to enter insert mode, then type:
30 23 * * * ~/Documents/rsyncbackup.command
cron works on a 24-hour clock, so 30 23 means 11:30 PM. Another thing: the file we just created with touch has no execute permission for the current user, and permissions lie at the core of security in *nix based operating systems. To make it executable, enter:
chmod u+x rsyncbackup.command
Now the command should run every day at that time. The fields and acceptable values are in this order:
minute 0-59
hour 0-23
day of month 1-31
month 1-12 (or names)
day of week 0-7 (0 or 7 is Sun, or use names)
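Putting the fields together, here are a few illustrative crontab lines; the script path is just an example.

```shell
# minute hour day-of-month month day-of-week  command
30 23 * * * ~/Documents/rsyncbackup.command   # every day at 11:30 PM
0  2  * * 0 ~/Documents/rsyncbackup.command   # Sundays at 2:00 AM
0  4  1 * * ~/Documents/rsyncbackup.command   # 1st of each month at 4:00 AM
```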
Also, a good tip to keep track of the sync is to add a file in the home folder of the current user with nothing in it except for your email address; cron will then forward each job’s output there.
You will now receive an automatic email from the mail system on that machine. This whole thing works really well if you’re the only user; if it’s a more complicated backup scenario you really need to plan it out for crontab to work. Happy Backup Day!
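The post doesn’t name the file, but on many Unix systems the file in question is ~/.forward: local mail, including cron job output, gets redirected to whatever address it contains. A sketch, with an obviously placeholder address:

```shell
# Redirect local mail (including cron job output) to a real address.
# ~/.forward as the filename is an assumption about what the post means.
echo "you@example.com" > "$HOME/.forward"
```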
Asus has a wide range of electronics, from motherboards to networking gear to laptops. I have maintained and purchased a lot of their stuff, as it is well built and affordable. I had gone out and purchased a mid-range router for $120 CDN and was pretty happy with its operation.
Then one day it just crapped out. Not “fully bit the biscuit” crapped out. More like the “can’t reliably negotiate wifi connections” kind of crapped out.
I filled out a ticket on the Asus RMA website; it was one day past a year from the date of purchase.
I was in luck. I felt pretty comfortable, as I’d had luck with Asus before, RMA’ing a motherboard at work that had failed at the very end of its life. I don’t think it was under warranty, as it was a few years old, but I called it in anyway.
I lit a cigarette (I was still smoking at the time) and called support to advance my ticket. While casually discussing the issue, feeling pretty sure that I was a respected customer and that service was Asus’ MO, I offhandedly mentioned my router had been dropped. I could have kicked myself. Why did I just admit to physical damage? Could he stop the ticket dead right there? Shite, I’m out $150 because of the grotesquely inadequate Toronto power grid.
But no. The agent simply filed the ticket so they could “take a look”. They soldered in a new chip and sent it back; I could smell the sweet odour of solder laid on the board a little too heavily. I wrote back the next week when performance hadn’t improved: it was still getting dismal speeds and having difficulty authenticating clients.
Around this time I discovered I had an issue with the DC power supply for my modem. Swapping the inexpensive power supply fixed a problem I had been suffering from for months, even though tech support had assured me I needed a new modem.
I mentioned this to the agent and he suggested I send in all components including power supply and antennae for proper testing. Duh. Now I am patiently waiting to see what comes of my router.
It is a great router with tons of features, and I’ll walk you through them just as soon as I get it back. I’m also shopping for a good UPS, as I still get weird hiccups in my AC at my new apartment. But forget it, Jake; it’s Chinatown.
A friend of mine posted on social media that his faithful MacBook Pro had died on him, leaving small bits of his projects out of reach on his hard disk. (He had IT rustle them up.)
This came at a time when I was experimenting with scripting rsync and SSH to backup a server to a secondary machine in the event things go wrong. Things do go wrong every now and again. Sometimes very wrong, like when you decided to skip putting a baking sheet under that mostly defrosted and fairly gooey Delissio pizza.
It’s easy to make backups in Mac OS X using rsync, even to a company server. There are a few things you should know beforehand, and your company IT guy can usually help you out with the specifics of your enterprise server. The thing about being the administrator is that you are your own gatekeeper: you can easily work the angles available to you.
If you’re on your own, in a smaller company, at home or just want to take a stab at automating your backups yourself then this guide is for you.
rsync is a powerful command line utility and should be respected! Please read the whole posting before punching these commands in! As a sync tool it has the ability to delete material in the destination folder!
Rsync is a powerful unilateral sync tool that retains all of your data with all of its attributes, such as ownership, permissions, dates and so forth, and it does so with compression and an algorithm that makes incremental backups super fast. It is fast and unobtrusive, even for a full backup of a fairly large directory. The compression boosts data throughput for the transfer and makes remote backups way less of a pain. It can also resume transfers that have failed, along with a whole bunch of other features. Andrew Tridgell, the creator of rsync, said it would outlive a host of his other creations.
I use the Asus RT-N16 as my home router and it kind of runs like my Linux server as well. With a 1TB disk plugged in and mounted you can use it for incremental backups, as a media server and for always on torrenting. In this case I am going to be scheduling a local directory from my Mac’s desktop, to be backed up daily and weekly.
Unlike other copy commands, rsync can remove files in the destination that have since been deleted from the source, truly “syncing” the two folders or drives. Note that this behaviour is opt-in: it is turned on by adding the --delete flag, and without it your backup disk will simply grow as you go until there is no more space.
This essentially means rsync mirrors one directory to another. With a little more doing, one can have rsync keep snapshots of the disk from a few days running, so that if you deleted something yesterday and the backup script ran, your files still exist. This process is well explained at the previous link and relies on hard links: each day’s snapshot looks like a complete copy of the directory, but unchanged files are just links back to the original backup, so only changed files take up new space. You can script this whole process with Automator or as a cron job, and then reliable backups are opened up to you at no cost and with just a little elbow grease.
Heck, I love Apple’s Time Machine, but it only works reliably when you have a network-attached disk taking regular backups. Not to mention it is a real bugger trying to sort out a Time Machine drive. Consequently, I have two of them sitting idle, in case I should think of some file I had forgotten about that happens to have survived in my Time Machine. Not likely…
The syntax of rsync is Unix-based and is the same on OS X and the various flavours of Linux. Learning rsync and cron scheduling is actually a good jumping-off point between the two operating systems, and knowing some basic Unix commands can really help you become a power user on any Mac OS.
rsync -avzn -e ssh /home/path/to/sourcefile/ user@IPaddress:/root/path/todestination/
Breaking the syntax down is easy:
rsync is the command and opens the syntax
-avz are the flags, which select among all the options available for the sync. The a means archive mode, a combination of a bunch of options meant for backups (rlptgoD). I have also included the -n flag, which performs a dry run so you can test the copy first.
-r, --recursive recurse into directories
It is important to note that when using the archive flag there will be differences when recopying with simpler options: the files won’t match in attributes and will be copied again!
-l, --links copy symlinks as symlinks
-p, --perms preserve permissions
-P, --progress and --partial combined
--progress shows the progress in the terminal
-t, --times preserve modification times
-g, --group preserve group
-o, --owner preserve owner (super-user only)
-D same as --devices --specials
--devices preserve device files (super-user only)
--specials preserve special files
It is also important to note that SSHing into a server requires the correct key files to be in place. SSH is a widely used remote shell for administration. With it you can copy files as if you were the superuser on that Linux machine, provided you log in with that account. If you need to do more than back up files and want to back up a whole system, it will be necessary, unless you are on that machine locally.
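Getting the “correct key files in place” generally means generating a key pair and installing the public half on the server. A sketch, where the key path, empty passphrase and server address are all illustrative assumptions:

```shell
# Generate an SSH key pair non-interactively (the path and the empty
# passphrase are illustrative; consider a real passphrase in practice)
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -N "" -f "$HOME/.ssh/backup_key" -q

# Then install the public key on the server (needs the server reachable):
#   ssh-copy-id -i ~/.ssh/backup_key.pub user@192.168.1.100
# After that, rsync over ssh will no longer prompt for a password.
```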
The simplest method to rsync to a company server would be:
rsync -avzPn ~/Desktop/Files\ for\ Work/ /Volumes/MountedServer/Mydirectory/Backups/

The ~/ or tilde before the slash is a Unix shortcut to the logged-in user’s home folder!
The tricky part here is the spaces in the path. Spaces are generally the breaks the shell understands as a cue to move on to the next part of the command, so “for Work”, without the backslashes, would look like the destination path, and the command breaks when that path isn’t found. A better explanation is here.
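To see the difference concretely, here are two equivalent ways of protecting the space; the /tmp/demo paths are just for illustration.

```shell
# Set up a throwaway folder whose name contains a space
mkdir -p "/tmp/demo/Files for Work" /tmp/demo/dest
touch "/tmp/demo/Files for Work/report.txt"

# Quote the whole path...
rsync -a "/tmp/demo/Files for Work/" /tmp/demo/dest/
# ...or escape each space with a backslash; both name the same folder
rsync -a /tmp/demo/Files\ for\ Work/ /tmp/demo/dest/
```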
Running that command with the syntax all in place should give you a transfer time and a speedup figure with no (or minimal) errors. Next time I will show you how to schedule it using the cron tables built into OS X.
Earlier this month I moved to a more central location. Change is good, especially when it saves money and makes life easier. HowToGeek wrote an article that demonstrates a very reasonable way to go ISP shopping in Ontario. I figured if the culture shock of moving to Chinatown could wear off in a few weeks, I could probably bear to change my ISP, since I was not on contract and I own my modem. The new provider promised almost no downtime and no contractual obligation either.
Why go ISP shopping if you’re happy?
An ISP, or internet service provider, determines the speeds your data can flow at, what traffic they charge you for and how much. I landed at Start. They offer plus packages with speeds of 25/10 for under fifty bucks if you stick to your data cap. If you’re on your own that should be fine, but if you have roommates you may want to opt for the unlimited package for $10 more. Their ratings on DSLReports are pretty promising. If you aren’t under contract, then what is the harm in shopping around?
They offer a great cable package with great upload speeds and don’t charge for upstream traffic. If you were thinking about hosting an FTP server for your home office, this would be the place to go. The comparables don’t have a lot going for them at this point, other than that I love how Teksavvy stood up for net neutrality. (I still love you, Tek.)
I will be the first to do the acid test and change providers and I will blog it here for your benefit.
So far I have had nothing but joy with this provider. Also good news in light of yesterday’s announcement by the FCC that net neutrality is over: Start customers are on a “verified” network.
I am really enjoying having a fast way to post material to the web.
I have always been partial to offline processing of media when editing, especially back when I started. The advantages of A-B comparison and highly tuned parameters were all I could see. I would spend hours after an edit processing individual clips or scenes, sometimes to my detriment.
As someone wise once told me, you need to see the forest through the trees; in editing as in life.
Enter the question: which workflow is better when working with denoisers? The answer is the subject of friendly dispute. Both are valid workflows; it is a matter of the application, reassigning tracks vs. processing clips.
Back then, online processing required the suite you were cutting in to be particularly beefy, with CPU clock speed being a major factor. That isn’t so any more, with the newest algorithms and high-speed i7 processors, so it was time for me to reevaluate my notions of online processing.
I would spend hours on a returning pass of dialogue, processing clip by clip, sometimes just to maintain the level of the noise floor. It was painstaking work, processing and endlessly comparing the resultant files to the originals. The start/stop nature of the work is tedious, especially with the indecision of inexperience.
The process of setting up a track to handle the noisy media, allocating all noisy media to that track and hoping the machine wouldn’t choke on it seemed too kludgey. I took pride in a dialogue edit that played like a mix. I had a luxury then that, on most of the shows I have cut or mixed since, hasn’t been available to me: ample time. Now I see that even allocating can be time consuming.
When handling someone else’s tracks that hadn’t taken denoising into account, reassigning can be a godsend, saving tons of time.
At this time I believe all the major DAWs for film post will allow for automatable bussing reassignment. If you are stuck in the old workflow try this instead.
The process of reassigning busses in Pro Tools is version specific. I believe the HD versions of Pro Tools let you reassign busses easily; in the non-HD versions you must make a null auxiliary track to work around this. Pre-fader sends work but are a bit more cumbersome in operation: you must make sure that mutes (in your preferences) are not pre-fader, or that your fader is down, and every time you automate the bussing of a section you must be sure not to double-bus the signal. With bus reassignments you simply loop the section as normal and select the alternate bus.
That said, if I am given ample time to edit I still go for processing, as my edit suite still runs a Q9550-series Intel quad-core processor. In a complicated template the VST performance meter has hardly any headroom, and throwing another algorithm into the mix introduces clicks, pops and stalling on playback.
The truth is, no matter which approach you take, it is always important to compare the original audio against your processed version. I find that in the online workflow it is easier to leave the processor turned up a bit high, or engaged when it need not be.
Recently the team at ImagesInSound and I were nominated for a Canadian Screen Award for Best Sound in an Information/Documentary or Lifestyle Program or Series. I’m honoured to be nominated.
Mayday is a great show to work on and all the above the line people are top notch. The great Michael Bonini’s jam packed tracks helped take this season to a new level. I am very pleased to be nominated with so many very talented people.
The full list of CSA nominees can be found here.
If you happen to be an academy member please vote for us!
I decided to flush Ubuntu, with its social media leanings, and go with a distro I had heard a lot about just talking to friends and colleagues. Linux Mint with Cinnamon is very stable almost out of the box on my GTX 260 (after the Nvidia 319 driver update). It’s got a classic interface which responds snappily. I’d say the visuals in Mint are less impressive but refined and orderly, so anyone who sat down at it could navigate the installed software. The launcher isn’t as contextual as Unity’s: searching for something like ‘audio’ won’t reveal the settings panel relating to it.

Linux Mint is a friendly, supported distro with a large user base like Ubuntu’s, and it is based on Debian. Mint comes with Transmission installed and a whole bunch of useful (open source) software for a full desktop experience right out of the box, and it’s lean enough for laptops that are getting long in the tooth; think pre-Core 2 Duo. It also has a pretty good package manager for installing software and adding repositories, plus all the other little interface features you come to expect from the bigger distributions. If you want a Linux box at home or in the office, it’s not a bad option.
Not a bad version of Linux for aging gear.