Feeling my way

So Apple’s looking for somebody to ‘Take Maps to the Next Level‘. If ever there was a phrase that falls with a dull thud, there it is.

There’s nothing wrong with maps. Not Maps, just maps in general. They’re great. It’s an amazing spin on our experience of the world, where our everyday vision is translated into a top-down view. We dream of flying, yet our imagination makes this leap over abstract concepts and three-dimensional experiences with ease every day. It’s not even a particularly modern or hi-tech thing, but more an innate human understanding, as the Mappa Mundi and Australian Aboriginal art demonstrate.

So that’s why I have two rather un-thumbed tomes on my desk: Objective-C for Dummies and iPhone Application Development for Dummies. Hey, I had a Sinclair ZX81 and learned BASIC on a Commodore PET, and I’ve written Lingo that makes an interactive CD-ROM do vaguely useful things. I too can write iPhone apps!

Because maps are Old Skool.

When the iPhone 3GS came out, with its combination of GPS and compass, I was so excited. In an interface-geek kind of way.

I want to fondle my iPhone in my pocket or wear it up my sleeve. I want to wave it hither and thither like a hyper-accurate dowsing rod and follow a route that you can feel as little ‘bumps’ by rolling over a virtual string that’s been created by location-aware helper apps.

Your GPS location and your iPhone’s compass orientation work together to give simple, non-visual feedback that works in any language, in any environment. Reach out and feel the virtual guide rope. As you wave your arm around, or simply spin the phone in your pocket, there’s a little ‘clunk’ – not a buzz, but a short yet heavy ‘clunk’ you can rock over. Like rolling a mouse over a big bit of grit. Just like Derren Brown feeling for micro-motor anomalies in an Italian passer-by, but a lot easier and quicker; you navigate round a strange space by a sort of virtual touch.

So all I need is to work out a direction-finding routine – surely built into Maps already – and tap into the APIs for the GPS, the compass and the wobbler (sorry, the documentation I’ve read so far doesn’t tell me which API makes the thing go ‘clunk’).
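
In my head the whole thing is only a screenful of code. Something like this back-of-a-beer-mat Objective-C sketch – the class, the waypoint and the 15-degree ‘clunk zone’ are entirely made up by me, it’s old-school retain/release (what the Dummies books teach), and the only real API calls in it are Core Location’s location and heading updates plus the standard vibrate call from Audio Toolbox, which is the nearest thing to a ‘wobbler’ I know of:

```objc
// GuideRope.m - a sketch of the 'virtual dowsing rod': give a little 'clunk'
// whenever the phone points roughly at the next waypoint.
// The class, the waypoint and the 15-degree tolerance are all invented here.
#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>
#import <AudioToolbox/AudioToolbox.h>
#include <math.h>

@interface GuideRope : NSObject <CLLocationManagerDelegate> {
    CLLocationManager      *manager;
    CLLocationCoordinate2D  waypoint;     // the next 'bump' on the virtual string
    CLLocationDegrees       bearingToGo;  // bearing from here to the waypoint
    BOOL                    haveFix;      // don't clunk until we know where we are
}
- (id)initWithWaypoint:(CLLocationCoordinate2D)target;
@end

@implementation GuideRope

- (id)initWithWaypoint:(CLLocationCoordinate2D)target {
    if ((self = [super init])) {
        waypoint = target;
        manager = [[CLLocationManager alloc] init];
        manager.delegate = self;
        [manager startUpdatingLocation];   // the GPS part
        [manager startUpdatingHeading];    // the compass part
    }
    return self;
}

// Initial great-circle bearing from 'from' to 'to', in degrees 0-360.
static CLLocationDegrees bearingBetween(CLLocationCoordinate2D from,
                                        CLLocationCoordinate2D to) {
    double lat1 = from.latitude  * M_PI / 180.0;
    double lat2 = to.latitude    * M_PI / 180.0;
    double dLon = (to.longitude - from.longitude) * M_PI / 180.0;
    double y = sin(dLon) * cos(lat2);
    double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
    return fmod(atan2(y, x) * 180.0 / M_PI + 360.0, 360.0);
}

- (void)locationManager:(CLLocationManager *)mgr
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    bearingToGo = bearingBetween(newLocation.coordinate, waypoint);
    haveFix = YES;
}

- (void)locationManager:(CLLocationManager *)mgr
       didUpdateHeading:(CLHeading *)newHeading {
    if (!haveFix) return;

    // How far off the guide rope are we pointing?
    double offBy = fabs(newHeading.trueHeading - bearingToGo);
    if (offBy > 180.0) offBy = 360.0 - offBy;

    // Inside the (made-up) 15-degree 'clunk zone': the only wobbler I know of
    // is the standard vibrate sound.
    if (offBy < 15.0) {
        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
    }
}

- (void)dealloc {
    [manager stopUpdatingLocation];
    [manager stopUpdatingHeading];
    [manager release];
    [super dealloc];
}

@end
```

In real life you’d probably only want the clunk as you swing into the zone rather than on every compass tick (and heaven knows what constant vibrating does to the battery), but that’s the general shape of it.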

I’ll then generate some really great marketing spin at the tail end of beta testing – a viral video with lots of people waving phones in front of them in a parody of dowsing, then a deal to get it into a bit of pulp fiction centred on some American city with ties to more European cities – and then sit back and wait for the millions to roll in from the iStore.

Except I fell at the first fence.

I really shouldn’t write code. I am really bad at it; I don’t have the mathematical knowledge, the patience or the raw skill to get beyond the ‘hello world’ stage. And I haven’t enjoyed getting that far. It’s like trying to write poetry in a foreign language, or a national anthem for an obscure musical instrument. You really need to know stuff that’s not about what you want to do. There’s so much you need to know just to get past Programming 101 that, well, really, look – I don’t do pointers or memory management or all that stuff. I thought I could explain a bubble sort, but I got it all wrong. Programming will shorten my life, and the gravestone will have a syntax error.
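
For the record (and so the gravestone can at least be accurate), a bubble sort just keeps sweeping through the list, swapping any adjacent pair that’s out of order, until a whole pass needs no swaps. Something like this throwaway scrap – which is about my level:

```objc
// bubblesort.m - the sort I got wrong: keep sweeping the array, swapping any
// adjacent pair that's out of order, until a whole pass makes no swaps.
// Build with: clang bubblesort.m -framework Foundation -o bubblesort
#import <Foundation/Foundation.h>

static void bubbleSort(int *values, int count) {
    BOOL swapped = YES;
    while (swapped) {
        swapped = NO;
        for (int i = 0; i < count - 1; i++) {
            if (values[i] > values[i + 1]) {
                int tmp = values[i];          // swap the out-of-order neighbours
                values[i] = values[i + 1];
                values[i + 1] = tmp;
                swapped = YES;
            }
        }
    }
}

int main(void) {
    int marks[] = { 42, 7, 19, 3, 99, 1 };
    int count = sizeof(marks) / sizeof(marks[0]);
    bubbleSort(marks, count);
    for (int i = 0; i < count; i++) {
        NSLog(@"%d", marks[i]);               // prints 1, 3, 7, 19, 42, 99
    }
    return 0;
}
```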

So maybe I’ll make that ‘iPhone Torch’ app that’s a tenth of the quality of the worst on the iStore, but which I will use because I WROTE IT (no I didn’t, I copied the code from an example and modified it in the hope I could make a 2900K version, but settled for ‘white’). And even that will develop a memory leak, and my once-reliable iPhone will require a twice-weekly restart until I restore the thing from scratch.

So folks, ideas are cheap. Implementing them is really hard. Funding their implementation is extremely risky. Risk gets more reward than hard work. Hard work gets more reward than coming up with an idea. But coming up with an idea, working hard at it and backing it up at risk to yourself can be very successful… or not.

So I really hope we can take Maps to the next level. Not just super-maps, but something beyond abstracted wiggly lines. Even just a little quiet variable-pitch whistle that does the ‘warmer/colder’ childhood game to find your goal.

And no, that’s not my idea. Ian Fleming, Goldfinger – the book, not the film.

Dedo LEDzilla – a lustworthy toy

I’ll admit right up front that one of the joys of the work I do is the acceptance that a lot of the kit I use is actually my own grown-up toybox. There are my favourites, there are some let-downs, and best of all there are the amazing press-nose-to-window must-haves to yearn for. And the latest of these is the Dedo Onboard LED Lamp, affectionately known as ‘LEDzilla’.

At first glance, it’s another in the litany of on-camera lamps – sun guns, bashers, reporter lamps – many of which are used like searchlights in the dark. As such, their lighting quality is more technical than aesthetic, being the equivalent of an on-camera flashgun for stills. Sure, it illuminates whatever is in front of the camera, but the subject ends up looking like a medical specimen or a startled rabbit rather than a beautiful picture. News doesn’t happen where the best light can be found, and there comes a point in any videographer’s work roster where your camera is going to need some help. But that, in quality terms, is where many of these devices sat.

The issue with on-camera lights and dark surroundings is one of the fundamental laws of physics: the Inverse Square Law. Basically, if you have a light close to your subject, the fall-off is pretty huge, so the tip of a nose could be overexposed while the ears are too dark. As distance increases, the fall-off becomes much gentler, but the overall amount of light drops too, so you have to start off with more light, or focus that light into a narrow tunnel – and putting lenses on lights (Dedo owners have already guessed where I’m going with this) makes them heavy and not exactly camera-top friendly. Throwing more light without lenses requires a lot more power, and that brings back unhappy memories of doing a Chewbacca impression by wearing a couple of PAG belts to power a sun-gun and camera (as the camera assistant, I might add – in the days of Plumbicon ENG cameras), but I digress.
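
To put rough numbers on it (the distances here are mine, purely for illustration): illuminance follows the inverse square law, so the fall-off across a face depends only on the ratio of the distances from the lamp.

```latex
E \propto \frac{1}{d^{2}}
\qquad\Rightarrow\qquad
\frac{E_{\mathrm{nose}}}{E_{\mathrm{ear}}}
  = \left(\frac{d_{\mathrm{ear}}}{d_{\mathrm{nose}}}\right)^{2}
```

Put the lamp 0.5 m from the nose and 0.6 m from the ears and that ratio is (0.6/0.5)² ≈ 1.44 – a touch over half a stop, which is why the nose blows out while the ears go muddy. Back the same lamp off to 3 m and 3.1 m and the ratio drops to about 1.07, a barely visible tenth of a stop – but now you need roughly 36 times the output just to get the same exposure on the nose, hence the lenses, or the Chewbacca battery belts.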

I’ve purchased a lovely little ‘helper’ lamp – the LitePanels Micro – which has done a great job of filling in the shadows under eyes in office lighting, putting a little sparkle in hair if held over the top of an interviewee, or casting a ‘screen glow’ from a computer display or laptop. Its light weight and battery power mean it can be stuck in all sorts of odd places. But it’s a close-up lamp. The LEDs’ light dissipates too quickly for any sort of distance work beyond a couple of feet as a primary source, and maybe double that as a fill.

But the overall power of a lamp isn’t the deciding factor. As cameras get more and more sensitive in low light, you need some control – to lift an interviewee out of a dim background by washing in just a leetle bit of light, not blasting them out like an out-take from Close Encounters. So the dimmability of a fill-in lamp is extremely desirable: to take ambient light up one stop, or to fill ambient light and take shadows up to one stop below.

So to cut to the chase: Rick and I meet up at a big job recently, and he’s got his new Dedo LEDzilla. In the daylit foyer of a big convention centre (okay, let’s name-drop – the Palais des Festivals in Cannes), he’s able to boost the shadows enough to give a nice glow to faces in a contrasty lighting area (with spill from neon, tungsten and the like). The rushes are great! I get to play with it a bit – it’s a miniature, battery-powered Dedolight. The Dedo lens system is there, focusing an intense beam for long throws, or putting out a nice wide glow with edge-to-edge evenness (no bright spots or dim doughnuts, or spurious colouration towards the edge). A flip-down tungsten filter doesn’t knock the lamp back much – loads of power to spare. And it seems to run forever on a Z1 battery.

Okay, so it’s a hard light source. Nothing wrong with that – key lights were hard for ages. It takes more precision to set a hard key light, since a soft key is very forgiving, but I’d be tempted to use it more in natural lighting situations as a filler, as well as a hard key in indoor situations like vox pops – even if it goes on a bendy arm hanging off the tripod to get it off the camera’s axis. It’s light enough and wire-free enough to live in a stuff-bag, and whilst an Arri Magic Arm might be required for a mains-powered Dedo, the LEDzilla is small enough to be supported on a gorillagrip or gooseneck.

And because it’s small and battery powered, it’s a great effect lamp too. We did a little setup where I could hide the lamp between a couple of props, and because the light is focused through lenses, there’s so little spill that the attached barn doors are for shaping rather than flagging – so its position is invisible.

In comparison with other LED lamps I’ve played with, it’s an incredibly well thought out tool: the lamp body is hinged like an anglepoise, getting the light away from the lens axis so shadows give some modelling. The spot and flood control is incredible when you consider how small this unit is. The dimming is a lot smoother than the LitePanels’ and can do that ‘gleam’, barely-on setting that will lift shadows in dark environments. It will also throw a beam of light across a room with enough oomph to key a talking head.

Although I’ll be keeping the LitePanels, I know I’ll get far more use out of the LEDzilla, and I can even fantasize about having several as a miniature interview lighting setup.

Sony updates SxS drivers for Snow Leopard, some bugs remain (not any more)

UPDATE:
Snow Leopard users should now be using XDCAM Transfer version 2.11, which cures the bug with previews, and therefore brings us full compatibility with Snow Leopard.

Oyvind Stokkan has posted the following links on the DVinfo.net board for new drivers for Snow Leopard users.

http://www.sony.ca/promedia/drivers.htm

http://support.sonybiz.ca/esupport/eSupportHome.asp

Thanks to Oyvind for passing this on! I’ve installed it (the download provides an uninstaller for the old version, which common sense dictates one should run before installing the new one). And no more Panics! SxS is safe to use.

Now, in an ideal world, when we register a camera like the EX1 with Sony, and they ask us what platform we edit on and which software we use, plus an email address, Sony would have checked through its database of FCP-using Mac owners and emailed us to let us know this update was available. Just in passing. Hey, it would have been nice for them to email us about the problem too, but perhaps that might have been a little dangerous.

But I’d prefer to give Sony a big thumbs up for fixing this issue quickly. Snow Leopard has been in beta for some time, and by the law of averages, somebody probably knew about this and rang the alarm bell under the cover of a Non-Disclosure Agreement, but that is just conjecture and of little importance. SxS cards now work in Snow Leopard.

All we need now is a fix for the cache bug in XDCAM Transfer. The tool throws up an error for every clip, complaining that “The clip thumbnail could not be saved to the cache as it was an unrecognised format” – though clicking Okay builds an uncached thumbnail that needs rebuilding each time you click it. Just keep clicking and things happen, but it’s annoying enough to warrant a moan – nay, a toddler-style whine – to Sony for a fix.

PS: You could use FCP’s log and whatever, but my workflow is based around easy batch naming and embedding metadata into each clip.

How to kill a Snow Leopard

IMPORTANT UPDATE: Sony has released new SxS drivers for Snow Leopard – http://www.sony.ca/promedia/drivers.htm – thanks to Oyvind Stokkan on the DVinfo.net board for passing this on! But for the archive, the rest of the post goes thusly…

Somebody has to be first. Somebody brave or stupid. Or somebody with a full backup sitting on a hard disk in the case of moving up to a new operating system, because there be dragons.

On the morning of the 28th August, I was there in my local store with half a dozen fellow MacBraves queueing up to purchase the latest Apple Operating System With A Feline Name – and this one’s Snow Leopard. Probably because ‘it’s like the previous one, Leopard, only cooler’. Indeed it is in many ways. You’ll find lots of interesting and learned information about it elsewhere on the web, so I will refrain.

So I pop the install DVD into my backup machine, previously Time Machined, and just hit the button. Just like any user would do. That’s the way Apple wants us to experience things – shove it in, click the button and go and do something else for an hour. Ping, there’s your ‘new’ Mac, freshly booted, looking the same but strangely different, like it’s had an incredibly expensive haircut that you’re supposed to notice. And yes, the first fifteen minutes are great. Everything works straight out of the box. No awful hangs even when quite serious software opens up.

I will point out, though, that when I shove in a Sony SxS card (what my cameras shoot onto), my Snow Leopard machine acts as if Derren Brown hit it with a long hard stare. The screen slowly wipes down to a sort of half-tint darkness and a little dark box appears in the middle of the screen, with calm white lettering telling me to shut down the computer by holding the power button and then switching it on again. This is known as a Kernel Panic, and it is a very, very rare thing.

No, it’s not a missing driver, it’s a bug. If you start the machine up with the SxS card in it, the same thing happens BEFORE you get to the blue screen of life (Macs’ screens go blue just before you get to your desktop or choose your log-in, in a sort of antithesis of Windows’ BSoD). So it’s pretty terminal.

The solution, at the moment, is to stick to using my MxR adaptors, which use the same slots but work using USB magic rather than SxS magic. The point is that they work, so the slot is working, and XDCAM Transfer is working, so the drivers it installed are working.

So the next step is to reinstall everything and try again. But to be honest, my preferred step is to go back to Time Machine, ditch the Cool Cat and catch some rays before the weather cottons on that it’s a bank holiday weekend.

Well… it’s been interesting being a pioneer striding into the future of computing – albeit a pioneer attached to the past by a nice strong bungee rope. The next time I try this will be in December.

Clip Art

If you own the popular Sennheiser Evolution 100 G2 wireless mic system, you’ll be aware of the horrible, pig-ugly, cheap and tacky clothes pegs that are attached to its microphones. Owners have been crying out for a replacement, literally begging suppliers for something – anything – that will clip the little, non-standard microphones onto stuff.

Well, we can all buy a beer for Marcus Durham, who tipped me the wink about using the Shure WL93 Tie clip, for about a tenner for two (one left hand, one right hand). They’re plastic, and not exactly invisible, but ye gods, what a difference!

I’ve looked through a few of my videos recently (see the post about show reels), and I’m wincing every time the ghastly appendages crop up like big plastic dung beetles. My son demonstrates the difference.

These replacement clips are an absolutely mandatory purchase for every G2 owner.

Excuses

I was going to run a little April Fools’ gag this year, and never got round to it: basically, put an iPhone into a full 15mm rails system with a nice big matte box and a 35mm DoF adaptor, as the ultimate cinéma vérité rig. And of course Alex Lindsay has done just that in the latest edition of MacBreak: http://www.pixelcorps.tv/macbreak227.

But it has got me seriously thinking about the iPhone as a video device.

Sure, there are better cameras (video and stills), and there are better cameras on phones, but here is a device that trims and publishes as well as shoots. And if it were to roll several trimmed clips together… that’s going to be quite a killer feature. There have been devices that purport to edit clips ‘in camera’, but frankly it’s been too painful to get beyond a proof of concept. Tie a cut-down version of iMovie into a device that you slip into a pocket, though, and you’ll be more tempted to polish and publish little video postcards than ever before. Desktop iMovie is great and very quick, but you still feel like you’re starting a mega project as you ingest your footage and scan through your rushes.

Often a snap is transformed by a little cropping, some finesse with the levels and perhaps even a caption. A video that contains an Establishing Shot, a master shot, some cutaways, and a couple of captions and even a voiceover, is a programme. Not a very complex programme, but completely watchable. Which is the crux.

Sometimes the enemy of good is ‘better’ – when we lose the plot and get hung up on production values and gear and everything, when the moment is now. Transient. Fleeting. You can shoot it on a little solid state mini camcorder, but then you’re in search of a laptop to download it to, and a power outlet to feed it whilst you edit. Meanwhile, the iPhone movie has been topped and tailed and is working its way up to YouTube (albeit slowly via 3G).

I’m not going to sell my EX1 and start making movies on an iPhone. But I still think that this could be a new genre of film making. The sort of ‘Bar Camp’ to the usual ‘Moscone Keynote’.

Well, that’s my excuse for buying a 3GS. What’s yours?

alt.edit.final.final.final

It’s the last 10 yards of an edit. Well, actually, today it’s two edits simultaneously.

Unlike, I imagine, big beefy long-form edits, corporate edits spend the first 80% of their time being chopped together with major things happening – large-scale grading and audio finessing – and then spend the next 80% of the time being tweaked to death. A caption update here (oopsie – re-render), a shuffle of sections there (necessitating a complete reshuffle of the music edit), make this slower, make that faster. It’s what we do for a living, and quite frankly it’s the shift from the ‘Director’s Cut’ to the ‘Release Print’. Not all Director’s Cuts are, if the truth be told, ‘better’.

So I’m doing major version controllage on filenames, i.e. I export my FCP movie as ‘Main edit – final’, then there’s a slight change, so it becomes ‘Main edit – final final’ and so on. Yes you do, don’t get all coy, we all do it. So we start being all professional about it and using numbers for version control, and the real ‘greybeards’ start their versions with a leading zero, knowing that all final edits end in double figures.
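
If I ever get properly disciplined about it, the greybeard convention is easy enough to script. Here’s a throwaway Foundation sketch – the ‘Main edit vNN.mov’ naming pattern is just my example, nothing FCP knows or cares about – that looks in an export folder and suggests the next zero-padded version:

```objc
// versionator.m - a toy cure for 'final final final', assuming exports are
// named like "Main edit v07.mov". Build: clang versionator.m -framework Foundation -o versionator
#import <Foundation/Foundation.h>

int main(int argc, const char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSString *folder = (argc > 1) ? [NSString stringWithUTF8String:argv[1]] : @".";
    NSString *base   = (argc > 2) ? [NSString stringWithUTF8String:argv[2]] : @"Main edit";
    NSString *prefix = [base stringByAppendingString:@" v"];

    // Look through the export folder for anything matching "<base> vNN.mov"
    // and remember the highest version number we find.
    NSArray *files = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:folder error:NULL];
    NSInteger highest = 0;
    for (NSString *name in files) {
        if ([name hasPrefix:prefix] && [[name pathExtension] isEqualToString:@"mov"]) {
            NSInteger version = [[[name substringFromIndex:[prefix length]]
                                  stringByDeletingPathExtension] integerValue];
            if (version > highest) highest = version;
        }
    }

    // Leading zero, greybeard style: every final edit ends in double figures anyway.
    NSLog(@"Next export: %@ v%02ld.mov", base, (long)(highest + 1));

    [pool drain];
    return 0;
}
```

Point it at the folder you export into and it saves the ‘was final final the one after final?’ archaeology.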

All this is fine. What gives me the time to vent frustration on this blog, whilst YET ANOTHER version of my edit(s) oozes out of FCP, is that rendering as you go can lead to problems. Rick – we’ve both been here, and the mystery is in rogue render files shared between sequences within a project.

When making tweaks to a sequence, requiring outputs two, three or four times a day, it feels good to render everything out and allow only the bits you tweak to change. Saves lots of time.

But I’ve found something horrible: what you see in the Canvas window does NOT necessarily equate to what you’ll get in the exported sequence. I had an imported movie from Google Earth Pro, which had a glitch in it, and an overlaid graphic sequence from Motion. Everything played fine within FCP. When I exported it, haunting flashbacks from non-working versions overwrote what I saw in the Canvas. As soon as a transition started, the background (the Google Earth Pro movie) snapped from the latest version to an earlier, lame version. So I tweaked the transition by a frame, forced a re-render – but the export was the SAME. Ditto other glitches.

Bottom line: I’ve tried to flush out ALL renders for the whole project in order to get Export to QuickTime (make self-contained) to work. Nope. Perhaps I should have checked the ‘re-render’ box instead, but nope. Glitches still existed from phantom renders – and I think it’s because other sequences in the same project use the same render files, so they DON’T get deleted even when you ask them to be.

I finally got what I wanted by exporting to a different codec, making it self-contained and re-rendering the whole thing. It took six times longer to do the final export, of course, so only do it if you have to, or as your final stage. I wouldn’t want to do that for every output, but if what you see in FCP and what you get in the export don’t match, do this.

So that’s my lesson today: all ‘final final final final’ movies should be rendered out from scratch in a lossless codec, otherwise the ghost of renders past will be hanging around like a fart in a space suit.

Graphic artists borrow, artists steal

Sorry to inaccurately paraphrase Mr Picasso, but I’m trying to excuse the fact that I’m watching Top Gear. With a notebook. I’m pretending to be a petrol head, but I’m watching the editing like a hawk. Or a hungry chicken. Whatever.

I’m watching it through the BBC iPlayer. I can’t stand broadcast TV any more – I don’t just want to pause live TV, I want to be able to stop it, walk away, mow the lawn, make the wife some tea, watch something else, then pick up where I left off. Frame by frame if I feel like it. iPlayer rocks, even if it’s a fraction of the technical quality of broadcast TV. Broadcast TV and schedules and adverts and ‘did you see last night’ are so dead… But I digress.

It’s automotive pornography, it’s without any useful educational content, it’s rather divisive (ho ho! In the extreme, dear friend), but the pictures are pretty and the editing is exciting. And there are lots of bits of metal that go ‘brum’ loudly.

I don’t edit long segments about cars, but why am I occasionally pausing the video and going through it frame by frame, working out what’s done in-camera and what’s in post? Why am I making mental notes about ‘the sound of transitions’? I’m counting the frames in sequences and, in beard-stroking moments, watching how edits contain more and more sub-15-frame content and ‘glitch’ edits. Pixellation is no longer a ‘whoah’ thing – to the audience it represents DV drop-out, a moment when tape and drum did not connect. So camera shake and rolling frames are out; in their place, a ‘blik!’ sound effect, a few random pixellated areas and perhaps a flash frame or two.

Our children are watching this too. My four-year-old son thinks that fire engines go ‘nee nahh nee nahh’, but of course they don’t – they never have in his lifetime. They go – well, you know what they do. It’s like the green-screen text in The Matrix – does anyone under forty who is NOT a geek know what a command line is? More to the point, what will my four-year-old make of film scratches, or film jumping the gate in a projector? He sees glitches and freezes, he hears the ‘stut-stut-stut-stut-stuttering’ of internet movies starting up in iPlayer. The buddy-blocks of blown bandwidth.

It doesn’t stop there. He associates the ‘washing machine from hell’ Flash loading icon with a busy day for the internet. Ye gods, forget the Oracle of Delphi and the Ides of March: we have the spinning beachball of death and the washing machine from hell to tell us it’s a bad IP day for mortals.

Top Gear has been educating an audience with a visual style that’s abrasive (like rinsing your eyes in mouthwash), fresh, dynamic (and very IP unfriendly) – and I’m finding my edit style adapting to match. It’s all very ‘now’, very ‘cold shower’, very ‘mouthwash’ and ‘9 volt battery on your tongue’.

So when I had the chance to show my 70+ aunt my current show reel (it was that kind of afternoon), she got it totally.

Which leads me to a giddy pontification: when octogenarians are totally into blipvert editing, where do we go next?

Death of a hard disk

I had a hard disk failure the other day. MacBook Pro. It happened while I was on the phone to a client. No warning. Just like a stroke. Sudden, devastating, terminal.

The next day, I was due to fly out for an EX1 shoot, requiring the transfer of its SxS cards into a compatible device – the MacBook Pro being perfect for this. I do have a backup Mac – a MacBook – but it has no ExpressCard slot. And as backup machines go, it can’t really run Colorista, Motion or DVmatte Pro (all of which need the GPU). So although I have software backups, the hardware isn’t really a backup. And so, within three hours of the Spinning Beachball of Death, I owned a brand new MacBook Pro 17″.

Lesson 1:
If your income is dependent on a certain type of computer rather than a computer per se, have two of them. Not a posh one and a skivvy one. My backup was a helper, an ‘it’ll do, it can help out’. But if I need to load up a project full of colour correction, esoteric codecs, and (since my main machine is dead and gone) my current copy of Final Cut Pro, a skivvy computer will say ‘no’.

I call AppleCare – I purchased the full-on AppleCare package for my Mac. Sure, they will take it away, replace the hard disk and send it back – without transferring data, so it will be basically factory fresh. This may take up to three weeks. Three WEEKS? Three days would be a disaster. Apple’s response will be ‘just use your backup’. I pay for rescue: if my car breaks down, I call a number, and somebody arrives within an hour or two and gets me going. I thought I’d paid for that sort of service for my Mac, but no. AppleCare is not the AA. I wanted to turn up at a shop, have some kind person rip out the old drive and put a new one in, and walk away with my Mac plus the old hard disk, which could live in a USB enclosure if it should ever work again. What I got was 30 minutes of phone support reiterating everything I’d spent two hours doing, and a courier firm that always phones when I’m out and doesn’t want to call mobile numbers.

I’ll have the old MacBook Pro repaired, and it will be backup hardware for when my main machine calls in sick.

So, I now have a brand new MacBook Pro with a nearly empty hard disk. I also have loads of backups spread across 30+ hard drives – but I didn’t know which drive held the backup I needed, because in order to use the disk-cataloguing software, I’d have to install it on the new machine. So I did, and I found a recent backup. Great. I attached the hard disk and used Migration Assistant. Oh.

When you set up a brand new Mac, you’re asked to create a user account, which I duly did. This is me, this is my password. Okay, so it was the same as the previous one, otherwise the new software would be frumpy, being on a new Mac and all… But now I can’t restore my old self, because Migration Assistant won’t restore my old ‘Me’ over my new ‘Me’. And I don’t want to make a new me (Me1 rather than Me), as it sounds so lame to be a secondary also-ran on one’s own computer.

So that leads me to Lesson 2:

When in possession of a new Mac, make TWO accounts. The first one is a disposable admin account. Nothing to see, nothing to do. Most importantly: it is NOT personal. Just make it as plain as you can. From that account, THEN restore your personal account – your avatar of Macness – into the virgin machine. The funny thing is that this is Computer Administration 101 stuff. Of course that’s how corporate machines are set up. I should know – I used to do it myself. But somehow, when you’re a one-man band, the lessons of Big Corporate IT don’t seem to apply. But they do – if you don’t want to spend the next X hours reducing your brand new Mac, with its patiently set-up software, back to its virgin state so you can try again…

Right, so the new Mac is virgin again and as it boots, it swirls the Apple Welcome message. I set up an Admin account. I can now migrate my old entity to the new machine.

This can be done from a Time Machine drive, or from all sorts of third-party solutions like SuperDuper (http://www.shirt-pocket.com/SuperDuper), but then there’s version control. If your backups are spread across many drives to ensure no single point of failure, which one do you use? Well, from personal experience, not the one with the most recent modification date.

I restored from a backup that appeared to be from a week ago, but was actually six weeks older than that. Okay, no problem, I thought. I can restore other stuff to make up for the time difference. Email, documents, etc. – they’re all backed up separately, so no problem there.

But here’s the crunch: I’d inadvertently restored from a backup made BEFORE a major system change. Thusly, I had re-inherited all the little issues my updates had since cured. Due to the time pressure of getting things done now – getting up and running as soon as possible – I had gone from hard disk failure and rescue attempts to a fully operational machine fresh out of the shrink wrap in a day. No data loss, no info loss, but…

That leads me to Lesson 3:

Look after your tools like you look after your data. We all back up our work. It’s critical. Redundant backups everywhere. We lose no data. Tools? Heck, I’ve got the DVD install disks. I’ve got the URLs and the serial numbers. My hard disk’s tool kit is backed up every so often – bleah. Whatever. A fresh install cures all.

Fresh installs take time – lots and lots of time. And pain. And frustration. It’s a chance to make the previous installation BETTER by applying learned lessons.

So I restored my data from fresh backups, and it’s all good. I restored my tools from a six-week-old backup, and it’s pants. I’ve inherited a whole load of dross that I’d solved ages ago. Back up your tools like you back up your data.

Lesson 4:

I love SuperDuper so much; it’s got me out of nasty situations and helped me no end. I don’t trust Time Machine, as I need to know there are no silly gotchas in the restore process. But here’s the kicker, folks: if I’d had Time Machine running on a cheap USB drive when I was working at home (SuperDuper does the abroad stuff), I’d have saved the four hours it’s going to take me to reinstall FCS, Leopard and the rest to make my tools work as they should ‘out of the box’.

Summary:

  • If you earn money from your Mac, own two of the same (or thereabouts)
  • Always have two accounts on your Mac: You and Admin
  • Don’t get obsessive about backups – get regular about everything
  • Time Machine is better than it looks

But on the other hand, my new machine was budgeted for, and has twice the RAM, twice the hard disk and twice the GPU. Sorry I didn’t get a maxed-out iMac or a baseline octo-core Mac Pro – but that’s how Education By Fate works: great lessons, but the tuition bills are kinda high.

The Old Army Colonel And His Son On Holiday

Another edit is in the can. In approximately 9 hours, I’m off to do a Z1 shoot and really wishing it could have been an EX1 job. Okay, so I could shoot HDV – but the shots would stand out like trout in a fishbowl.

Whatever. No. Tonight, whilst I compress FLVs and upload them, I’ve been going through Ripple Training’s Deep Dive course – all about Motion’s 3D. Lots of Alphabet street and ‘which way is up’ moments. Ages ago, I did a fair bit of early After Effects and even Specular’s Infini-D, but often got lost. I had to get some ‘Doe, a deer’ rules.

So I thought I’d share these silly mnemonics for those who only sporadically dip their toe into the 3D universe:

“X is a cross” (as in across, left and right, geddit?), so therefore Y is uppY downY. And “Z is like Zoom” in and out. Not strictly accurate, but when it’s late and you’re reaching for the right (wrong) slider…

And for Motion,

“Red-X” or “Red Cross”, “Green trees grow up”, “Blue oceans into distance”.

Of course this is all second nature to motion graphics designers, and sure – I can reel off pixel aspect ratios and composite modes in a flash, but if you’re not doing it every day or even every month, we all have those ‘The Old Army Colonel And His Son On Holiday’ moments.

If you still remember your Cosines from your Tangents, that is.