
Jamie Balfour's Personal blog

For a fairly long time (since I moved into my halls of residence in 2013) I've been obsessed with something more than computers. Phones.

I'm not talking about smartphones; I'm talking about IP phones. VoIP, or Voice over IP, is something I've been interested in since about 2003, when I got to use Skype for the first time. I found it incredible that we could do these things, and Skype just kept getting better and better. But Skype had, and still has, one problem - you need a PC or some other device running the software to use it.

As I mentioned, in 2013 I got my own IP phone in my halls of residence. I had longed to test one like this, so getting to use one for the first time was really awesome. It was a Cisco one, if I remember correctly. In early 2014 I spoke with my father about getting an IP phone for the house and moving away from the standard PSTN (Public Switched Telephone Network), and with both of us being in some part of the computing industry (we both work with networking regularly), it seemed only right that we trial this technology at home. In both of my previous offices I got the wonderful experience of using IP phones - in particular at my last job, where we used Grandstream phones with all the bells and whistles. I looked up the phone we had at work and, to my surprise, it was only £70 odd to buy on Amazon.

Last Friday we officially began moving to IP phones across the house (and it's all been down to me to do it, but hey, I guess I've managed, even though it's been a completely new concept to me). SIP, or Session Initiation Protocol, is the backbone of our new network. The network is powered by a Raspberry Pi, which currently handles all outgoing calls and will soon deal with incoming calls too. Our original DECT analogue phones will slowly move to SIP, with one set moving this week and another remaining on the analogue line until we transfer entirely. My bedroom will use my new Grandstream GXP2170 (which was only £100 - just £30 more than I was going to pay, and a much better phone). I'll also finally be getting my own private number in the house.

As it happens, BT wants to switch the whole country from the basic PSTN to an IP-based network to make it more competitive, and it's a good move too. There's more about the change BT wants to bring us here.

I've finished setting up my SIP network. We've got a trunk for both inbound and outbound numbers. Asterisk on my Raspberry Pi deals with all calls and forwards them to three extensions in a ring group (we have more than three, so we will need to fix that). If the call is a business call, i.e. one for Jambour Development, only two phones ring. If it's for the house, all phones ring. We also have voicemail and an out-of-hours setup.
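
For anyone curious what that looks like, here's a minimal sketch of the ring-group idea in an Asterisk dialplan (extensions.conf). The context names, extension names and timeouts are made up for illustration - they are not our actual config:

    ; Hypothetical extensions.conf fragment - all names and numbers are examples
    [from-trunk]                                 ; calls arriving from the SIP trunk
    exten => _X.,1,Answer()
     same => n,Dial(SIP/kitchen&SIP/lounge&SIP/bedroom,25) ; ring all three extensions
     same => n,VoiceMail(100@default,u)          ; no answer after 25s -> voicemail
     same => n,Hangup()

    [business-line]                              ; Jambour Development calls
    exten => _X.,1,Dial(SIP/office1&SIP/office2,25) ; only two phones ring
     same => n,VoiceMail(200@default,u)
     same => n,Hangup()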
Posted by jamiebalfour04 in Tech talk
Tags: sip, session initiation protocol, new technology, gadget, fun

My friends have often laughed at our broadband, but now it was too much (or rather, too little). Our connection was abysmal - often achieving a paltry 2Mbps (0.25MBps). This meant that simple things like SSH couldn't even work properly and would often time out.

I will point out that where I live was one of the pilot locations for broadband when it first came to the UK, so we had faster broadband than quite a few people at the time. On October 26th we decided to complain about how slow our internet connection had become. During the complaint we were told an engineer would come out to have a look. The engineer did arrive, and mentioned that, according to the system, we were now able to get superfast broadband (up to 56Mbps or 7MBps). We were, of course, delighted, and although I remained somewhat sceptical, we decided to move to it straight away.

Now the fun starts. We were originally promised that the switch would happen on Tuesday the 31st of October and that there would be absolutely no disruption to our service until then. We were sent a BT Home Hub 5, then within a few days a new BT Smart Hub (aka the Hub 6). We were told to disconnect the old router and connect the new BT Smart Hub. We immediately lost our internet connection. On the 31st of October, after getting all excited for our new superfast connection, I was let down.

On the 2nd of November, after having had no internet for a week - meaning no maintenance on my web server and only one backup downloaded in that time (thanks to my friend Calum, who let me download a backup at his place using his superfast connection) - I was beginning to get weary. We were annoyed that we had no superfast connection that day, and when we phoned BT to find out why, there were a bunch of excuses, but in the end we figured out the main problem: the line wasn't supported. Wait a minute, didn't they say it was?

What BT said originally was not true: our house was not suitable for fibre, because we did not have a connection to the cabinet but a direct connection to the exchange, bypassing the fibre-enabled cabinet entirely.

So we were then promised the following Tuesday (7th November) for our superfast connection. I again remained extremely doubtful that it would all be done and dusted by then. My dad, on the other hand, kept firmly believing that it would happen.

As you've guessed, we didn't have it on Tuesday the 7th of November, but we were told it would be there on Thursday the 9th (the day I was originally meant to be moving into my own house, but that's for another story). We were then told on Friday the 10th that it wasn't going to happen then either, and would definitely be in place by the next Tuesday (14th).

All in all, I'm just so angry with BT for messing us around. I once believed they were a good company, but this has changed my opinion entirely.

I'm sure many of you will remember plenty of revolutions in technology, but perhaps the biggest was when we moved away from parallel ports, slow serial ports and eventually even PS/2, and moved to USB. Eventually, competition started to appear, such as FireWire and eSATA, whose focus was mostly on storage as opposed to general (or universal) use. Both have pretty much vanished. eSATA attempted to side with USB by forming a combo port, but unfortunately it did not last, partly due to the lack of robustness in SATA and eSATA cables, and partly because SATA as a standard has all but been replaced by PCI Express.

When PCI-X and AGP were superseded by PCI's younger sibling, PCI Express, we all knew PCI had come back with a vengeance. Boasting 2.5Gbps across a single lane all the way back in 2004, PCI Express was bound for success.

A single lane PCI Express card

But it's only now that one of the things I've dreamt of has become a reality. Back in 2011, I proposed on my blog an idea to expand PCI Express into external GPUs through the use of ExpressCard - an exceptionally clever use of both PCI Express and USB to make a single standard capable of multiple speeds (sounds familiar, right?). Of course, eGPUs did exist for ExpressCard, but they were slow and cumbersome - not something many people would want.

Behold, Thunderbolt! Thunderbolt was originally only capable of 10Gbps, which is roughly the speed of a single PCI-E version 3 lane - a 16th of a desktop graphics card's maximum bandwidth. Next came Thunderbolt 2, capable of twice the speed - we're now talking about 20Gbps - and eGPUs became genuinely possible. The problem with Thunderbolt 2 is that very few eGPUs were made for it. One possible reason is that the majority of computers featuring Thunderbolt 2 were Macs. Not many PCs were built with the capability to use Thunderbolt 2, and it makes little sense to make an eGPU for Mac users, since most of us don't intend to play games on our Macs and use them more for productivity.
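
If you like to see the arithmetic behind those numbers, here's a quick back-of-the-envelope sketch in Python. The line rates and encoding overheads are the published per-generation PCI Express figures; the variable names are mine:

    # Usable bandwidth per PCIe lane, by generation (Gbps)
    gen1_lane = 2.5 * (8 / 10)     # 2.5 GT/s with 8b/10b encoding  -> 2.0 Gbps
    gen3_lane = 8.0 * (128 / 130)  # 8.0 GT/s with 128b/130b        -> ~7.9 Gbps

    print(f"One Gen 3 lane: {gen3_lane:.1f} Gbps (Thunderbolt 1's 10 Gbps is in the same ballpark)")
    print(f"Sixteen lanes: {gen3_lane * 16:.0f} Gbps - so one lane is a 16th of a full desktop slot")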

Thunderbolt now uses the USB-C connector

Of course, the natural successor to Thunderbolt 2 was Thunderbolt 3. Prior to the announcement of Thunderbolt 3, the USB Implementers Forum (of which Intel is a key member) was working on improving the old USB standard with a more robust, smaller, yet more capable connector. Since USB Type A and USB Type B already existed, USB Type C was the name given to this connection. USB Type C, or USB-C as it is often called, offered up to 10Gbps over the new version of the USB standard, aptly named USB 3.1. Within a few weeks of USB-C being announced, it was made clear that the new USB-C connector (remember, USB Type C is the name of the connector, not the standard) would also become the connector for Thunderbolt (Thunderbolt has always been lazy about connectors - originally it borrowed the Mini DisplayPort connector as its primary interconnect).

USB-C becoming the primary Thunderbolt connector means this one connector offers USB 3.1 speeds of 10Gbps, DisplayPort speeds of 17Gbps, HDMI speeds of 18Gbps and Thunderbolt speeds of 40Gbps. To me this is awesome - it means so much will be replaced by this single connector.

Let's take a look at what this connector is directly capable of:

  • Thunderbolt 3 - x4 PCI Express Generation 3
  • DisplayPort 1.2
  • HDMI 2.0
  • USB 3.1
  • VGA - through DisplayPort
  • DVI - through DisplayPort or HDMI
  • USB 3.0
  • Native PCI Express cards providing many connections such as FireWire, eSATA, RS232, LPT and more
  • PCI, through a PCI Express converter
  • Thunderbolt 1 and 2 devices

Because of the ability to connect straight on to the system bus (PCI Express), the system can indeed use many different PCI Express cards directly. 

So, whilst I was originally concerned that Thunderbolt would destroy all other connectors, the Thunderbolt standard seems to have added better native connectivity to older standards than ever before, which is amazing. One connector for all finally seems to be true.

Posted by jamiebalfour04 in Tech talk
Tags: pci express, thunderbolt, express card, expansion, future, connection, usb-c, type, egpu

As a developer, there is one thing at the top of my list of things to decide on - the text editor.

The development environment needs to be pleasing and make you feel comfortable (whilst developing Dash I feel much the same way: if the content management system isn't user friendly, you can't be comfortable using it). I've been through a lot of editors, starting with several versions of Visual Studio - 2005, 2008, 2010 and 2013. They are all brilliant, and I'm glad I chose to use them for the seven or so years I was a .NET developer.

Things changed quickly, though, as I became a developer based on Mac OS X and was forced to find a new editor that suited my purposes. When I stopped developing in VB.NET and C# and began developing in Java, HTML, CSS, JavaScript, PHP and so on, I needed a new IDE to suit those languages. For the vast majority of them (all the web-based ones) I used Aptana Studio 3. Aptana was brilliant, but it quickly felt dated, and I just could not afford the time to find a new editor without being certain it was right for me. A good IDE needs to be extremely colourful (because that helps highlight the syntax), fast, not prone to crashing (as Aptana eventually started doing) and feature rich. For me, one of the most important features of an IDE is support for SFTP, which Aptana offers out of the box. I then moved from Aptana to Eclipse with the Aptana plugin - pretty good, to be honest.

Eclipse is brilliant for Java development, and I still use it because it can compile a JAR file in so few steps, it interprets and debugs programs well and it just feels like it was designed for Java. However, Eclipse eventually suffered from the same problem as Aptana and would crash from time to time - particularly in the Web perspective.

So I made another move, this time to Adobe Brackets. I jumped on the Brackets bandwagon when it was pretty young, and I loved it. Syntax highlighting is lovely, it's feature rich and it's open source. Unfortunately, this jump was too early - Brackets just didn't have everything I needed. In 2015, I started an Adobe Creative Cloud subscription. As a result I gave Dreamweaver a try and I liked it (looking back, I don't know why I liked it really other than the fact it had SFTP built in). 

Introducing Atom

Atom is now my favourite text editor. After being introduced to it by a colleague at work, I feel like I've come to love it. It's colourful, well designed, doesn't crash and has everything I need from a text editor or IDE. 

Atom is my new IDE of choice

Why is Atom nearly the perfect editor, though? Well, my first reason is that Atom has clear colouring - its dark interface clearly separates the background from the foreground, and its syntax highlighting is bright and stands out well. On top of this, Atom features a plugin system, meaning that if a feature you want isn't built in, it's likely to be available as a plugin somewhere. Atom is also fast - it doesn't slow down too much as files get larger. I'm talking about PHP files here, which I always break into logical files that rarely exceed 3,000 lines.

People may ask about Visual Studio Code - coming from a Visual Studio background, surely I'd like that? Well, yeah, I do. But I found Atom to be even nicer.

I think that if you are reading this and looking for a new text editor with a beautiful touch to it, Atom is well worth a try.

If you have a different favourite, I want to know what it is.

Posted by jamiebalfour04 in Tech talk
Tags: atom, ide, new, visual studio code, web development, developer

Whenever I am asked why I bothered building a personal website or why someone needs a personal website my reply is often something along the lines of 'it's fun' or 'it's my hobby'. But I very rarely touch on the benefits of my personal website.

There are a huge number of benefits to my own personal website. I get around 500 visitors a month. I use it to showcase my work to potential employers and to put myself on the internet in a public way that people can connect with me through, but there are other things too. I enjoy learning and teaching, so my website is also a source of information where I put tutorials to help others learn the things I know.

But really, what's the benefit? My first answer is that it's professional. The brand my website pushes forward gives me a uniqueness that now appears on all of my work. The orange and blue theme of my website is also apparent on my CV, any letterheads I send and certain emails. This looks highly professional, and people like to see it. I also believe that having your own brand puts you above others who do not.

The second reason for having a personal website is that the website is, well, personal - it's all about you. LinkedIn is great for connecting, but it's full of other people too. Go to jamiebalfour.scot and who do you think you are reading about? That's right, some guy called Jamie Balfour. There's nothing about John Szymanski or Murray Smith on there (well, there might be). This keeps the reader focused on you. You can write solely about how good you are and all of your achievements and, yeah, be a narcissist - blow your own trumpet!

The third reason a personal website is a must is that it gives people an easy way to read about you, from all corners of the globe. Social media is great, but it's also laden with other things - like other people - a bit like LinkedIn.

I will admit my website is more of a personal project that evolved into something more. For anyone in computer science it's pretty nice to show that you can build a website from scratch, so I did exactly that (it shows a lot of perseverance too). 

Posted by jamiebalfour04 in Tech talk
Tags: benefits, personal, website, professional

Ultrabooks are amazing devices - combining high-end mobile computing power with a slim design and decent battery life. The first ultrabook released was probably the MacBook Air, a moment I remember like it was yesterday. I've always really liked them and had an interest in them, but I never went out and got one.

In the last few years I've been following the development of one or two of them, but one in particular came to my attention - the Razer Blade Stealth. This ultraportable features two USB 3.0 ports and one USB-C port. The USB-C port also supports 40Gb/s over Thunderbolt, allowing Razer to take advantage of the PCI Express standard built into the laptop itself. As a result, Razer have developed the Razer Core, an external dock featuring a full PCI Express x16 slot. This allows you to insert a discrete desktop graphics card into the dock and use it as the laptop's graphics processor. This is why this ultrabook excited me so much.

Now the latest Razer Blade Stealth has arrived in the UK and is available to buy from their website.

The thing is, I'm saving my money, and unless I decide to sell my MacBook Pro - if that ever happens - I can't be buying this laptop.

A couple of months back I was a victim of a website (not to be named) that was hacked and ultimately gave its users' information away - mine included. The reason was that passwords were not stored in an effective manner, meaning that the minute you have access to the database, you have access to all of the passwords.

What this meant for me was that they had my email address and my actual address, and they began subscribing me to many things I would never sign up to whilst also sharing my username and password details on the web. It's a cold and horrible thing for someone to contemplate doing, because I had done nothing to them in the first place to provoke an attack. And to be honest, the website's sole duty was to help others - so it's pretty cruel to do that. Anyway, storing details about people in a secure manner is an important part of online security.

What an insecure database may look like

In a world where security was not a concern, algorithms such as SHA (the Secure Hash Algorithm) would not need to exist - in fact, the field of security would not need to exist. But unfortunately, because there are people who want to steal something, or damage something just for the sake of it, we have to compensate by developing secure ways of storing information.

In that world without security, passwords could be stored as plain text - simply as they were typed into the text input. This means that anyone with access to the database can scroll down to the appropriate field and read any password. Unfortunately, if a hacker gains access to this information, they have the raw password - the very password they can use to log in to the system. This is not good. So database designers, web developers and so on go a step further and use some kind of algorithm to conceal the password.

How to store sensitive data effectively

When data like a password is put into the database it should be encoded using some kind of algorithm.

The first way of storing passwords is to create or use an encryption algorithm to encode the password and a decryption algorithm to recover the password from the cipher text. This method is uncommon because it means there is at least one way to decrypt every password in the database, and it therefore leaves open a security vulnerability: if someone obtains the decryption algorithm and the key needed to decrypt the passwords, they can simply decrypt every password (and with weaker ciphers, having a password and its cipher text can even help an attacker figure out the key).
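
As a rough illustration of why this is fragile, here's a short Python sketch using the cryptography library's Fernet recipe (the variable names and example password are mine): whoever holds the key can reverse every stored password.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()    # this key has to live somewhere on the server...
    f = Fernet(key)

    token = f.encrypt(b"hunter2")  # what would be stored in the database
    print(f.decrypt(token))        # ...and anyone who steals the key gets b'hunter2' back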

The most common approach is a hash function such as SHA, which is designed to be collision resistant - meaning it should be computationally infeasible to find two passwords that generate the same hash. When this kind of algorithm is applied it is designed to be irreversible; that is, it is impossible (or at least near impossible) to figure out what the original text was without going through combinations of characters and testing each against the stored hash. This method is more secure than the former, since it offers no quick way to take a stored value and turn it back into plain text.
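
To make that concrete, here's a minimal sketch in Python of salted, iterated hashing using only the standard library (a real system would normally reach for a dedicated password-hashing scheme such as bcrypt or scrypt; the function names here are mine):

    import hashlib, hmac, os

    ITERATIONS = 100_000

    def hash_password(password: str) -> tuple[bytes, bytes]:
        salt = os.urandom(16)  # a fresh random salt per user defeats precomputed tables
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest    # store both in the database; neither reveals the password

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(candidate, digest)  # constant-time comparison

    salt, digest = hash_password("hunter2")
    print(verify_password("hunter2", salt, digest))  # True
    print(verify_password("letmein", salt, digest))  # False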

These are just two ways of storing passwords, but you can probably find others. I use a combination of both on my website (my own hashing algorithm with my own encryption algorithm on top).

Posted by jamiebalfour04 in Tech talk
Tags: secure hash algorithm, storing passwords, decryption, encryption, md5, sha

Windows 10 was an amazing operating system for a few days when I first installed it on my gaming PC. My gaming PC, the Zebra X2, is a beast of a machine that can run most games I play, like StarCraft II and GTA V, at the highest available settings (Core i7 4770K, 256GB SSD, 8GB DDR3 RAM and an AMD 7950), but latterly it struggled with simple things like starting up.

After I installed Windows 10 the machine ran fine. However, one day when I was playing a favourite game of mine, Command & Conquer 3, I noticed a slight drop in framerate compared with the time before. I didn't think too much of it at the time, but gradually I noticed that each time I played the game it got worse. At the very end, before I rid myself of Windows 10, it was running so slowly that when I used the graphics-intensive Ion Cannon superweapon the game would just freeze and the animation for the superweapon would not be shown; the game would resume after the Ion Cannon blast was finished. So what the heck was going on?

My initial thought was that the hard disk drives I stored my games on were starting to fail. I tested them all with SMART tools and none of them showed any signs of failure. I then assumed it was my SSD, so I installed an old SATA III HDD into the system and put Windows 7 on it. It ran fine. I upgraded it to Windows 10 and again, it ran fine. So I assumed it was the SSD. I left the SSD in the system, just disconnected.
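
For reference, checking a drive's SMART health from a script is straightforward. A sketch in Python calling smartmontools' smartctl (this assumes smartmontools is installed, and the device path is just an example):

    import subprocess

    # -H prints the overall health self-assessment; /dev/sda is an example path
    result = subprocess.run(["smartctl", "-H", "/dev/sda"],
                            capture_output=True, text=True)
    print(result.stdout)  # a healthy drive reports the test result as PASSED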

After a while, the same weird thing happened to the system - it began slowing and the graphics were getting messed up in games. So now I assumed it was the graphics card or the PCI Express controller that had failed on me. I took the GPU out of the system and used the integrated graphics built into the CPU. The system ran just the same, so I now knew it wasn't the graphics card that had failed, but I wasn't sure whether it was a motherboard fault such as the PCI Express controller or the memory controller.

I decided to reinstall the SSD and flash my BIOS. Clearing the BIOS meant I could set it back to factory defaults and test with them (I had tried this several times before, to no avail). Nothing changed.

My next choice was to wipe the SSD and install Windows 7 on it. After reinstalling, I panicked slightly as it wasn't working well at all, with the Desktop Window Manager crashing on startup. After installing Service Pack 1, everything seemed to work perfectly. I would like to say Windows 7 was the solution, but I can't be sure.

I would probably put the problem down to several things: Windows 10 was clogging up the system (I don't know why); the original BIOS was not designed for Windows 10 and would have required an update (I have since updated it again and may try Windows 10 again in the future); and Windows just needed that little reformat we Windows users need to do on a regular basis.

My fix appeared to have come from the reinstall of Windows 7 and the BIOS reset. I will keep everyone up to date with my progress with Windows 10 again in the future.

The Windows 10 upgrade tool can be a pain!

Due to the upgrade tool in Windows 7, I have since been upgraded to Windows 10. This time the system appears to be running well - at least in comparison to how it was before. I will keep you posted if it begins to slow down again.

Posted by jamiebalfour04 in Tech talk
Tags: windows 10, 7, 8.1, 8, later, os, failure, gpu, bios, computer, custom, cpu, ram, sata
USB-C

USB-C seems set to take over now

The future of physical connectivity in computer systems looks very limited in variety. One day in the future I can foresee all devices connecting with a connection not too dissimilar to USB Type C. The reversible USB connector, released a year back with the new MacBook, was received with both positive and negative responses. For me, it was an incredibly positive product, since it does a lot of things in one.

Apple just didn't get it right by releasing the MacBook with only the one connector. At the moment, adapters are still not everyone's cup of tea. In fact, for most people, adapters never will be a good solution.

Anyway, the main point of this post is not talking about the MacBook, it's talking about something that I feel strongly about, physical connectivity.

One connector for all...

I don't really like this one-connector-for-all idea, since I've always liked having different connectors for different purposes.

You know, I remember when I made the switch to FireWire over USB about 7 years ago; I thought buying all my drives with FireWire would be great, since it was going to be the future of data connectivity. I know I came to it late, but I didn't expect it to be removed from all of my devices within a few years! Take my Macs, for instance: my Retina MacBook Pro does have a Thunderbolt to FireWire adapter available, but this isn't ideal and it's expensive. My Mac mini does have a FireWire 800 connector on it, but the new models also require the adapter, since they no longer feature FireWire. My PC is just as annoying, since it doesn't even have a FireWire header. My previous PC (the Zebra, built in 2011) featured a Gigabyte GA-Z68X-UD3 motherboard with 3 FireWire connectors (including headers); then, after an upgrade 2 years later to a Gigabyte GA-Z87-UD3H motherboard and the Haswell architecture, I suddenly had no FireWire connectors at all.

Actually, one of the most annoying things is that my PCI cards (not PCI Express) - which I've been using since about 1998 when I got my first PC, including a really old but still useful video capture card (circa 2000) - no longer work, since there are no free PCI slots in my system (a serial port card and a TV tuner occupy them).

A single connector for all also brings about the concern of overloading buses, and of whether or not everything is polled (as USB is). Speed can become an issue when one connector is used for everything.

However, one connector for all is a good thing too, since every device you use will use that connector and it's easy to remember what you need (i.e. a USB-C cable). But many companies, such as Apple, are difficult, try too hard with their own connectors and make one-for-all unworkable. Look at the Lightning connector, for example: every other smartphone uses Micro USB 2.0 or Micro USB 3.0, meaning you can share your charger with any other smartphone user - everyone, that is, except iPhone users. It never works. One for all is too difficult.

Complications also arise when you are working with very specific applications. I for one still use the decades-old RS232 standard for many things, such as electronic circuit boards for experiments (although I'm looking into using a Raspberry Pi for this in the future) and for sending control commands to my projector. With a connector like USB-C this becomes more complicated: RS232 was a highly simple connector, and old standards like it become harder to emulate.
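
To show what I mean by simple: talking RS232 from a modern machine typically goes through a USB-to-serial adapter, and a few lines of Python with the pyserial package are enough. The port name and command bytes below are hypothetical, since every projector speaks its own protocol:

    import serial  # pyserial: pip install pyserial

    # Example port and settings - check the device's manual for the real values
    with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1) as port:
        port.write(b"PWR ON\r")  # hypothetical power-on command for a projector
        print(port.readline())   # read the device's acknowledgement, if any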

Another, even more annoying, thing is having to buy adapters to keep older devices, such as serial port devices, working. These devices may be hard to come by, but the bigger issue is that if we end up needing all these adapters, we have to pay for them - and the more specific adapters will probably be fairly expensive.

Here is a comparison table showing how Thunderbolt has changed over the years:

Version         Maximum Speed   Maximum Power Output   Connector Type
Thunderbolt     10Gbps          10W                    Mini DisplayPort
Thunderbolt 2   20Gbps          10W                    Mini DisplayPort
Thunderbolt 3   40Gbps          100W                   USB-C

Notice any similarities between Thunderbolt 3 and the USB-C port announced on the new MacBook? That's because the two will likely merge.

The conclusion

Here's my solution: make the one connector for all a reality, but keep the old connectors alongside the new one, giving people like me the option to use older connectors without needing to buy an adapter. This keeps costs down and leaves users of older devices able to continue using them. Don't cut out USB Type A and replace all mice and keyboards with USB-C connectors.

We cannot live in a world where we need to keep dongles for everything; it's simply ineffective and expensive, especially when one breaks. Because we have come from a world of loads of different connectors into a world wanting a single connector for all, I believe we will be faced with many problems. This is particularly the case in industries such as the music industry, where today's devices really lack on the connectivity side. The world got rid of the Gameport (or MIDI port) without any major problems, but those who had purchased MIDI devices that used the Gameport-style connector had to go out and buy adapters or get new devices. The change can come on too rapidly, especially for some. The slow but very painful disappearance of FireWire over the last few years has made its mark, even for me, since I can no longer connect my FireWire drives to my PC (I can with my Mac mini, thankfully). And now companies are phasing out audio jacks - in particular the multi audio jack system - all in favour of USB or single audio jack solutions.

Posted by jamiebalfour04 in Tech talk
Tags: usb c, type-c, usb-c, 3.0, hdmi, displayport, thunderbolt, pci-e

Tesla, who just so happen to be one of my favourite companies, are unveiling their latest electric car - the Model 3. I am very excited by this, mainly from a technology perspective but also from an environmental point of view.

Tesla's current lineup of cars is absolutely stunning, and while it may seem like a dream right now, I am very interested in the range and would love to own one some time in the future (the lineup will likely have changed by the time I get around to looking at buying one).

Tesla are a fantastic company who build things to an outstanding standard, as I noticed when I was in one of the Tesla Stores. They also innovate far more than other car manufacturers, so kudos to them.

I am excited by the Model 3, however, as it will have a lower price point than the other models and could possibly be the next car my parents buy. The future is electric, and I'm hoping the Model 3 proves this further.

Posted by jamiebalfour04 in Tech talk
Tags: tesla, model 3