iTalc – UltraVNC and Installation

We use a cool piece of open source software at work in our labs called iTalc that allows teachers to monitor all of the computers in a given computer lab. Not only does this let the teacher keep an eye on what the kids are doing (both to keep them on task and to make sure they’re doing their work correctly), it also lets the teacher lock each student PC’s screen when he/she wants their attention, project his/her screen to every student PC, or use a student PC as an example and project that screen to everyone else. Like I said, it’s a very cool application.

The other day my co-worker asked me to work with him on a new lab setup because he was having issues with iTalc. I wanted to document our findings for the future and for others who might be having similar issues (since we couldn’t find too much about iTalc out there on the Internet – though they do have a wonderful support wiki and EduGeek has good forums).

  1. iTalc uses VNC as the underlying technology and we use UltraVNC for remote support.
  2. When UltraVNC and iTalc are installed on the same system, they fight over the port (5900) and UltraVNC seemed to win every time – the iTalc agent would just keep closing and re-opening itself.
  3. iTalc 2.0.0 can “supposedly” be told to run on another port (same with UltraVNC), but we didn’t have luck getting the iTalc agent to run on a different port.
  4. We had to uninstall UltraVNC on the lab PCs on the network. Since we deploy UltraVNC via a group policy, we had to deny the application of that policy to the computers in the labs then uninstall UltraVNC.
  5. We installed the latest iTalc agents, version 2.0.0, on each PC in the lab (they were Windows 7 SP1 64-bit computers).
  6. The last system I planned to install was the teacher station, or the iTalc master, which also needed to have UltraVNC uninstalled, but is a Windows XP SP3 32-bit computer.
  7. iTalc 2.0.0 32-bit did not run successfully on this PC.
  8. After much research and testing, I found others complaining about iTalc 2.0.0 on 32-bit systems. I finally found someone who had a copy of the iTalc 2.0.0 32-bit installer with a different file date than the officially posted download. Be aware that this other installer asks you to install the “Babylon” search engine (which you can decline). If I ever find where I got this file from, I’ll post a link, since I can’t seem to find it right now.
  9. iTalc works!
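The port conflict in step 2 is easy to confirm before installing anything. Here’s a minimal sketch in Python (my own helper, not part of either product) that checks whether something is already listening on TCP 5900, the default VNC port both programs fight over:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port (TCP)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    # If UltraVNC (or any other VNC server) already owns 5900, a second
    # agent such as iTalc will fail to bind it and may keep restarting.
    print("Port 5900 in use:", port_in_use(5900))
```

On a Windows box you can also find the owning process with `netstat -ano | findstr :5900` and then matching the PID in Task Manager.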

Lessons learned:

  • Don’t run iTalc and UltraVNC on the same computer.
  • Upgrade/test the teacher PC and one student PC at the same time – having already done all of the student PCs and then not being able to get the teacher PC to work was not fun.
  • Keep teacher and student PC platforms consistent (in our case, the mismatch was Windows XP 32-bit vs. Windows 7 64-bit).
  • Wait for iTalc version 2.0.1 to come out 🙂


Apple and Google Service Status

I love service status webpages – there are so many services users rely on every day that are in the cloud and it’s nice to be able to determine whether a particular service is having issues or not.

I’ve known about Google’s status page for a while.

But more recently I came across Apple’s status page for its “cloud services” (aka iCloud-related services). I found the page when there was an iMessage outage a month or more ago. Granted, I already knew about the iMessage problem, but it was nice to know Apple could confirm an issue.

Microsoft has one for Office 365 too, but you need an account to view it, so I won’t post a link.

mRemote Update Script

I have been using a piece of open source software called mRemote (multi-Remote) for a few years. mRemote is an application that allows users to manage multiple remote desktop connections. While it has many useful features (e.g., it can connect to SSH, VNC, and RDP sessions from one interface), the main ones I like are tabbed remote desktop sessions and a configuration file that “bookmarks” your connections. Unfortunately, development of mRemote ceased when the primary developer, Felix Deimel, started working for a company with a similar (paid) product called Royal TS. Personally, Royal TS sounds amazing (cross-platform, syncing of configurations, etc.), but my company won’t pay for it.

When I arrived at my current company, I installed a new fork of the old open source mRemote application called mRemoteNG (multi-Remote Next Generation). I set up the application with all of our servers in a configuration file, but I wanted to share it with my co-workers, so I wrote the .bat file below to let them use my custom configuration file. The script is very simple and asks for input (u for upload, d for download, and q for quit). You’ll need to modify the script to work in your environment (change the variable for the remote path and confirm the variable for the local path), and you’ll need a Windows share set up that other users can access.

If you’re not planning to share the configuration file, this script could also be used to back up a configuration file – you could change the paths to a personal share or a Dropbox folder.

@echo off
REM Author: Your Name Here
REM Date Updated: Date
REM Purpose: CLI for updating an mRemote configuration file.

REM explain usage
echo u - uploads your local mRemote configuration to the server.
echo d - downloads the server mRemote configuration to your computer.
echo q - quits program.

REM define global variable for server path
set serverpath=\\server\share$\path\to\config\file\confCons.xml
set localpath=C:\Users\%username%\AppData\Roaming\mRemoteNG\confCons.xml

REM go to startingpoint function
goto startingpoint

REM startingpoint function - displays options and directs to other functions based on selection
:startingpoint
     set /p task="Press (u)pload, (d)ownload, (q)uit: "
     IF "%task%"=="u" (goto upload)
     IF "%task%"=="d" (goto download)
     IF "%task%"=="q" (goto done)
     goto notvalid

REM download function - downloads the configuration file from the server to the local hard drive (and creates a local backup first)
:download
     echo Creating backup and downloading file ...
     copy /V /Y "%localpath%" "%localpath%_BACKUP"
     copy /V /Y "%serverpath%" "%localpath%"
     goto done

REM upload function - uploads the configuration file from the local hard drive to the server (and creates a remote backup first)
:upload
     echo Creating backup and uploading file ...
     copy /V /Y "%serverpath%" "%serverpath%_BACKUP"
     copy /V /Y "%localpath%" "%serverpath%"
     goto done

REM notvalid function - if the user does not enter u, d, or q, explain usage again and return to the starting point
:notvalid
     echo Not a valid choice ...
     echo Acceptable choices are:
     echo u - upload
     echo d - download
     echo q - quit
     echo ... Please try again.
     goto startingpoint

REM done function - exits program
:done
Time Machine via AirDisk and CrashPlan+

I’ve been struggling with my personal backups for a while. I use a MacBook (white 13″ mid-2007, OS X 10.7.5 Lion) at home as my primary computer and have always used Time Machine. How I use Time Machine has changed over the last few years:

  • I started using Time Machine with a locally connected external portable USB Western Digital hard drive. This always worked well, but required me to remember to plug in the external HDD to run a backup.
  • I bought an AirPort Extreme Base Station (AEBS) for Christmas 2011 and connected my USB printer, my wife’s backup HDD, and my backup HDD via a powered USB hub. In doing this, I was able to share out each hard drive independently and re-seed our Time Machine backups. This worked great and automatically backed up our data hourly – no physical connections required!
  • I found a Good Friday deal this past year (2012) and bought a 1-year subscription to CrashPlan+ for 98% off (something like $2.40 for unlimited data and up to 10 computers for the year). I always liked Time Machine, but figured I should have an off-site backup too and for under $5 for the year I couldn’t pass it up.

My local wireless Time Machine backup combined with my offsite wireless CrashPlan backup seemed like the ultimate solution, and it was for about 10 days.

Starting about a week before Christmas, my Time Machine backup stopped working. After much troubleshooting, I realized that I couldn’t even connect to the AirDisk shares on my AEBS. Rebooting the base station fixed this problem temporarily. Reformatting my HDD and re-seeding my backup only worked until the initial backup finished, then backups continued to fail. Again, I couldn’t connect to the AirDisk shares without rebooting the base station. I did more research and tested more theories, but Time Machine would never run for long. I finally shut off Time Machine and left it off for a few days. During this time, I could connect to the AirDisk shares anytime I tried. Once I enabled Time Machine again, it ran for a few backups, then failed again.

Eventually it dawned on me that Time Machine stopped working around the time I started using CrashPlan+. I disabled CrashPlan+ and Time Machine has been running successfully for about 4 days now, no hiccups whatsoever!

It seems that, much like running more than one antivirus solution on the same system isn’t recommended, running multiple backup solutions on the same system also doesn’t appear to be the best idea. It’s unfortunate, too – CrashPlan+ has an article about how their product complements Time Machine. Personally, I will rely more on Time Machine since I have it set to back up everything on my hard drive; I only used CrashPlan+ to back up certain areas that I wanted offsite. So, if I can keep Time Machine working, I’ll be happy.

Some additional ideas I wrote down for troubleshooting that I haven’t gotten to try yet include:

  1. Wipe AEBS settings/configuration – I already downgraded firmware to 7.6 and then re-applied 7.6.1 when the downgrade didn’t work.
  2. Test wireless Time Machine backup without USB hub and enable CrashPlan+ again.
  3. Test directly connecting my portable HDD to my laptop and running the Time Machine backup again with CrashPlan+ enabled.

If I ever find out exactly why the two technologies don’t work together or if I ever discover a workaround, I’ll post again. One theory I had was that there could be an issue if they’re trying to backup the same file at the same time. I couldn’t find a way to schedule either (for example, running them every other hour), though I’ll admit I didn’t look too hard yet.
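One untested scheduling idea: Lion’s tmutil command can turn off Time Machine’s automatic backups (sudo tmutil disable) and start a backup on demand (tmutil startbackup), so in theory a launchd job could run Time Machine every other hour and leave the remaining hours to CrashPlan+. Below is a sketch of such a job – the label is hypothetical, and whether this actually avoids the conflict is pure speculation on my part:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "">
<plist version="1.0">
<dict>
    <!-- Hypothetical label; any reverse-DNS-style name works -->
    <key>Label</key>
    <string>local.timemachine.everyotherhour</string>
    <!-- Kick off a Time Machine backup via tmutil -->
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/tmutil</string>
        <string>startbackup</string>
    </array>
    <!-- 7200 seconds = every 2 hours -->
    <key>StartInterval</key>
    <integer>7200</integer>
</dict>
</plist>
```

You’d load it with launchctl; a similar trick on the CrashPlan+ side would be needed to keep the two from overlapping.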

Using ‘nslookup’ with multiple DNS search suffixes

At work I need to use a custom DNS search suffix list on my primary network adapter so that I can resolve hostnames on my primary Active Directory (AD) domain and so that I can resolve hostnames against a secondary AD domain that is also on my network.

While I have been running with a custom search suffix list for a few months now, I just found a small problem with the setup: when using the nslookup command to look up a name against an external DNS server, the last DNS search suffix in the list was appended to everything I looked up. For example, if the suffixes in my list were <domain1.com> and <domain2.com>, my commands looked like this:

Default Server:  <local DNS server>
Address:  <local DNS server address>

> server <external DNS server>
Default Server:  <external DNS server>
Address:  <external DNS server address>

> <external.hostname.com>
*** <external DNS server> can't find <external.hostname.com>.<domain2.com>: Non-existent domain

I had no idea why the last suffix kept appending itself to my queries. Eventually, after doing some research, I found that this is “normal” behavior for the nslookup command in Windows, and in order to get around it, I’d need to add a period (.) to the end of the request. For example:

Default Server:  <local DNS server>
Address:  <local DNS server address>

> server <external DNS server>
Default Server:  <external DNS server>
Address:  <external DNS server address>

> <external.hostname.com>.
Non-authoritative answer:
Name:    <external.hostname.com>
Address:  <external host address>
Mystery solved! I can use nslookup to resolve against an external DNS server again, as long as I end the request with a period. Also worth noting: technically, a fully-qualified domain name (FQDN) always ends in a period – the trailing dot is just usually assumed.
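The suffix behavior above can be sketched in a few lines of Python. This is my own simplification, not the actual Windows resolver algorithm (the real one has more rules about when suffixes apply), but it captures why the trailing period matters:

```python
def resolve_candidates(name, search_suffixes):
    """Return the list of names a suffix-appending resolver would try.

    A trailing period marks the name as a fully-qualified domain name
    (FQDN), so no search suffixes are appended to it.
    """
    if name.endswith("."):
        return [name.rstrip(".")]
    # Otherwise, each configured search suffix is appended in turn.
    return [name] + ["%s.%s" % (name, s) for s in search_suffixes]

# Without the trailing dot, the suffixed names are also candidates:
print(resolve_candidates("www.example.com", ["domain1.com", "domain2.com"]))
# With the trailing dot, only the exact name is tried:
print(resolve_candidates("www.example.com.", ["domain1.com", "domain2.com"]))
```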

The stereotypical “first post”

I decided to create a blog.

This blog is not necessarily to attract readers, but it is to chronicle various technology-related tips, tricks, issues, and solutions that I’ve stumbled upon over the years.

Being in the tech industry, I often come across complicated (and some not-so-complicated) problems and finally wanted a way to keep track of them along with their solutions. Hopefully this blog will help others who might have similar problems and will also allow me to remember the answer to: “I’ve seen this problem, how did I fix it last time?”