Since about 2015 it seems like there are always Raspberry Pis lying around the place, at work and at home, just waiting for some project to come along and put them to use.
One of the Pis looking for a job in my home lab has just found itself retasked as a Plex Media Server, following a very simple process.
A fresh operating system was written onto the SD card; this time out the OS chosen was the Raspberry Pi variant of Ubuntu Server, in order to have a more consistent experience across the systems I’ve been using lately.
Once up and running and fully patched, the Plex repo was added and the Media Server installed via apt.
Directories were added to host the media library and the Plex configuration was run. Once complete it was a simple matter of connecting the TV to the new media server via the Plex app and enjoying the content that had been driving this little endeavour in the first place.
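For anyone wanting to repeat the exercise, the whole thing boils down to a handful of commands. The repository URL and package name below are the ones Plex publishes for Debian-based systems; the media directory layout is simply my own choice.

```shell
# Add Plex's signing key and apt repository, then install the server.
# These steps need root and network access, so they're shown as comments:
#   curl https://downloads.plex.tv/plex-keys/PlexSign.key | sudo apt-key add -
#   echo "deb https://downloads.plex.tv/repo/deb public main" | \
#       sudo tee /etc/apt/sources.list.d/plexmediaserver.list
#   sudo apt update && sudo apt install plexmediaserver

# Create a simple library layout for Plex to index (the path is my choice):
MEDIA_ROOT="${MEDIA_ROOT:-$HOME/plex-media}"
mkdir -p "$MEDIA_ROOT/Movies" "$MEDIA_ROOT/TV"
ls "$MEDIA_ROOT"
```

Once the server is installed, its web interface on port 32400 walks you through pointing the libraries at those directories.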
I heard it said recently that a Raspberry Pi with decent enough storage can be purchased for less than the price of running an EC2 instance for a year, so when it fits the use case a Pi can be a great solution, and even in the age of cloud there are still times when it’s better to run locally (streaming dodgy media being a classic).
Far too often in home lab situations, where there’s some interesting idea to be explored, efforts get derailed by sinking too much time into the environment setup, particularly when something like Oracle Database is involved. Maybe the initial idea is to play with data migration, to investigate some new feature, or just to brush up on skills, but whatever the scenario there’s a need for an environment. Then, just like that, a week has passed and you’re just about ready, having hunted down all the obscure errors and issues thrown up as the setup progressed, when you get distracted by something else, because the idea you wanted to work on has been crushed by the weight of getting things ready.
Finding a good excuse to learn a new technology is something I’ve been interested in for a while, particularly as the pace of change and technology adoption only ever seems to increase. Knowing that the IT world is in an ongoing state of flux, finding an engaging way to learn about the tech that technologists are expected to know is more than idle curiosity; it’s essential for career success.
Revisiting chatbots, a topic I first encountered through a Wired article from summer 2015, I discovered that getting a chatbot up and running, and more importantly useful and usable, involves a wide variety of different technologies. As the article heaped so much praise on GitHub’s bot, I decided to try it out with a view to having it do some useful tasks around AWS administration, one of the two challenges that had been put to our 2016 interns.
Employee engagement is a big issue for any company. Put simply, disengaged employees lead to departing talent, so it’s vital to be constantly looking for ways to develop teams, especially those working with technology, in ways that result in high levels of ongoing engagement.
Oracle very kindly offered to run through a Proof of Concept installation of their GoldenGate replication product with us, and seeing as how we’re not prone to looking gift horses in their mouths, we’ve accepted!
Quiet Fridays are a nice treat but something too often squandered. This past Friday I found myself left to my own devices as the rest of the DBA team were working from another location. This left me free to get a few things about running our environment straight in my own mind, the most notable being how to use CommVault for database backup and recovery beyond simply copying RMAN output to tape.
Data Guard // Database Incarnations // Standby Logs applying in Alert Log but not in V$ARCHIVED_LOG
Over the past week I’ve been getting a great introduction to the practical workings of Data Guard. In the past I worked a lot with Disaster Recovery systems built on Oracle Standard Edition, which is not licensed to use Data Guard, and in those circumstances a poor man’s version of the system was put in place by using rsync to synchronise archive logs between a production database and a DR database. With the logs in place, a script run on a schedule would recover the data from the logs. It turns out that the concept of Data Guard is pretty similar: it’s basically about getting archive logs to the right place and setting the destination database into a mode where it can read those logs and be ready for when disaster strikes.
The programmatic nature of Oracle’s Recovery Manager tool provides great flexibility but also creates a common problem of insecure scripts being left running on production servers allowing anyone to read critical password details with ease. Continue reading “Securing RMAN Scripts”
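As a taste of where that post goes, one common mitigation is to avoid embedding passwords at all: connect with OS authentication and keep the command file readable only by its owner. A minimal sketch (the file names and working directory are my own):

```shell
# An RMAN command file with no credentials in it: OS authentication
# ("CONNECT TARGET /") does the work, and file permissions do the rest.
WORK_DIR=$(mktemp -d)
cat > "$WORK_DIR/backup_db.rman" <<'EOF'
CONNECT TARGET /
BACKUP DATABASE PLUS ARCHIVELOG;
EOF
chmod 600 "$WORK_DIR/backup_db.rman"   # only the owning OS user can read it
ls -l "$WORK_DIR/backup_db.rman"

# Scheduled from cron as the oracle user, e.g.:
#   rman cmdfile=backup_db.rman log=backup_db.log
```

With no password in the script, there’s nothing for a casual browser of the production server to read.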
Shared storage systems are very common in enterprise computing but it’s not cheap or easy to get access to the technology for education or testing purposes. In this posting I investigate whether a system like Openfiler can work in those circumstances, and if it’s worth the effort. Continue reading “SAN Simulation with Openfiler”
Last week I found myself getting defensive with a colleague of mine in a conversation about the value of system prototyping. My colleague suggested that there’s a difference between IT people and everyone else: two IT guys can look at a diagram and see how a system will work, while everyone else needs to see the system in action. There is perhaps an element of truth to this, but like most things in life I think it depends on the people involved (which is why I leapt to the defence of IT people everywhere; I don’t like generalisations that somehow mark us out as different). Sometimes I can look at a diagram and make the imaginative leap to how a system might work. Other times I like to play with a piece of software to get to know it, and at the end of the day there’s nothing like using a thing to understand how it operates.
Also last week I was discussing Oracle Warehouse Builder with someone. OWB (as it’s known) is one of those systems that I’ve experienced but would like to know better, and it offers some functionality that might solve a nasty little problem, one that many businesses suffer from and that has bothered me for a while.
The I.T. industry is made up of lots of companies who declare themselves to be solution providers of varying sorts but I wonder how many times a prospective customer really goes to one of these firms and says “I have a problem” and how many times the provider actually has to go off and come up with a solution.
Paracetamol is a wonder drug. Whenever I get the man flu (which can kill!) I have found that plain old paracetamol works far better than the various cold and flu remedies offered at outrageous prices at the local pharmacy. The way that particular medicine can deal with a man flu and the crippling symptoms that come with it is truly amazing. While many think that the age of wonder gave way to the age of reason a long time ago, I still find wonder in how things work, be that painkillers or computer networking systems. Continue reading “Dynamic DNS or “New Adventures in Old Routers””
When I was in college our programming lecturer cautioned against a career in I.T. due to the need to keep up with the constantly changing technologies such a career entails. Personally, I thought that sounded brilliant: always something new to learn and play with, bring it on, I said. Later in my education I encountered other lecturers who said that yes, keeping up can be difficult as well as fun, but there are some fundamental skills you can learn that transcend specific technologies (which is why good developers are able to learn several languages as long as they have a good grasp of the principles of programming).
I.T. is constantly changing and with that pace of change some technologies come into and slip out of fashion quite quickly. As technologies like smartphones and other internet devices become accepted by wider audiences than traditional I.T. people, fashion plays an increasingly important role. For example, the iPhone. Continue reading “Hotmail & the iPhone”
I’ve always had an interest in knowledge management and how tacit knowledge in particular can be captured within an organisation. I once read about a management consultancy that uses a specially constructed database to store information about the different industries and past projects the consultants have worked on, so that less experienced consultants can access the knowledge of the senior staff when dealing with situations unfamiliar to them. The value in a management consultancy is in the experience (and therefore knowledge) of its consultants, so the database they use is an incredibly valuable resource that forms an integral part of the offering to clients. It occurred to me that a database like this is pretty valuable to any type of knowledge worker, in my case I.T., so recently I set about implementing a personal version of the management consultancy knowledge database.
An Inconvenient Mess: Al Gore’s office space and his unique system of knowledge management
A pdf version of this posting can be downloaded here.
Websites, in all their different forms, are hosted on web servers and the Apache web server is one of the most popular currently available. It would be incredibly inefficient to only host one website per server, particularly in a commercial web hosting scenario, but for designers and developers at any level there is often a need to work on different sites, or different versions of the same site, in the same environment. In order to make the most of your resources the logical approach is to host multiple sites on one installation of Apache server.
On Apache each website is treated as a Virtual Host, a concept that ties in with how DNS is used to route traffic to a website. When a web browser requests a URL, DNS resolves the hostname – for example, www.somesite.com – to a specific IP address: the address of the server that’s hosting the site. The browser then connects to that server, which in turn responds with the information requested. As far as DNS is concerned, the hostname bound to a specific IP address is just another host in its database; when several sites share one address, it’s Apache that uses the hostname the browser asked for to pick the right virtual host.
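In Apache’s configuration that mapping shows up as one `VirtualHost` block per site. A minimal name-based example, assuming Apache 2.4 syntax (the hostnames and document roots are placeholders):

```apache
# Two name-based virtual hosts sharing one IP address; Apache serves the
# block whose ServerName matches the hostname the browser asked for.
<VirtualHost *:80>
    ServerName www.somesite.com
    DocumentRoot /var/www/somesite
</VirtualHost>

<VirtualHost *:80>
    ServerName www.othersite.com
    DocumentRoot /var/www/othersite
</VirtualHost>
```

With both blocks in place, requests for the two hostnames arrive at the same IP address but are served from different document roots.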
I’d been looking for an excuse to try this out for a while, but as with such things, a reason presented itself that REQUIRED me to try Kon-Boot in order to get onto a Windows machine that I didn’t have a password for.
The situation is this – a typical problem that presents itself to administrators is new starts with no notice; that is, a new person is coming into the company on Monday and I was told about it today. Today is Friday. The person is only going to be here for a short while but they still need a PC and access to the company network, so I had to rustle something up for them. There had been a spare PC on my desk for a while and this seemed the perfect opportunity to get rid of it, however when I powered it up I realised why it was there in the first place. It works, but no one can log on: no one has a clue what the local password is, and it won’t connect to the company domain so those user accounts are no good either.
I’d seen Kon-Boot on an episode of Hak5 and had sworn that I’d get round to trying it out for real. Not too long ago a friend of mine contacted me asking for advice on how to deal with the very problem I’d just been presented with, and I suggested that he try this naughty little piece of software that’s designed to get you through the pesky security on a Windows computer. I’m not sure if he ever tried it but I vowed that I would.
So, this very afternoon I was finally given the excuse I needed to do this (legitimately) at work. I searched for the website (see links below) and downloaded the ISO image for the Windows version. This I burned onto a disc, which I used to boot the PC in question. Upon boot you are presented with an old school boot screen showing the credits for the developers of the software. This reminded me of the credits that used to go at the beginning of old Amiga games that had been cracked, a favourite of the kids at the school I attended back in the day – if you take a look at the Kon-Boot website you may notice some other references to the old Amiga systems.
Once you get beyond this screen another, similar screen lets you know the system is loading. From this point you are in familiar territory as the XP loading screen is presented and the computer gets to the CTRL ALT DELETE prompt as normal. All is far from normal however, as once you press those three keys you can put in any old muck and the system will log you on.
That’s it. You’re in at that point and free to do whatever you please.
First impressions are very favourable, though the device is in late prototyping and MS aren’t even near an official announcement (all the details on the web are the result of a leak from Microsoft).
This could be the form factor to put a much needed spark back into Microsoft and the computer world in general. If nothing else, it looks like the computer book Penny had in Inspector Gadget, so that’s cool!
Here’s an annoying little error that I’ve just come across.
I’ve tried to send an email from Outlook and I’ve received a reply from “System Administrator” with the word “Undeliverable” in the subject field along with the subject of the mail I tried to send.
In the message itself I get the following:
Your message did not reach some or all of the intended recipients.
Subject: Out TEST 1
Sent: 09/10/2009 14:49
The following recipient(s) cannot be reached:
‘email@example.com’ on 09/10/2009 14:49
553 sorry, that domain isn’t in my list of allowed rcpthosts (#5.7.1)
The message is a little cryptic, as the idea of a list of allowed recipient hosts suggests there’s a corresponding list of disallowed ones, and that might lead you to believe that your firewall or anti-virus system is acting up and doing more than it’s supposed to.
What’s actually happening is that the SMTP server is telling you that your outbound email settings are wrong. In my case it was the “Outgoing mail server (SMTP)” setting, and the cause was my changing Internet Service Providers: my new ISP has a different SMTP server, and I needed to tell Outlook what that server’s name was.
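You can see the refusal directly by talking to the SMTP server by hand, which makes the error much less mysterious. A sketch of such a session (the server name and addresses are placeholders; the 553 reply is the same one that came back via Outlook):

```
$ nc smtp.old-isp.example 25
220 smtp.old-isp.example ESMTP
HELO mypc
250 smtp.old-isp.example
MAIL FROM:<me@example.com>
250 ok
RCPT TO:<email@example.com>
553 sorry, that domain isn't in my list of allowed rcpthosts (#5.7.1)
```

The server is refusing to accept mail for that destination because it doesn’t recognise you as a client it relays for – exactly what happens when you connect through a new ISP but still point Outlook at the old one’s server.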
In order to change the SMTP setting for a given e-mail account in Outlook you need to do the following:
1. In Outlook go to Tools > Account Settings. Select the account you want to fix and click on Change.
2. About halfway down the window that opens is a section headed “Server Information”. In the third box down enter the correct name of the SMTP server (this can be obtained from your ISP). Click Next and Finish and that’s it!