
Why Is Kenya Power Dumping Pre-paid Meters?

May 19, 2015

Recently, the country's only power utility company announced that it was slowing down the roll-out of the prepaid metering system it launched about six years ago. The reason given for this about-turn was that the company is losing revenue: it now collects less from customers on prepaid metering than it did when the same customers were on the postpaid metering system.

According to Kenya Power records, about 925,000 of its 3.17 million customers are on prepaid meters. Before those 925,000 moved to prepaid, the company collected about four times more from them than it currently does. The Kenya Power MD stopped short of accusing prepaid customers of meter tampering in his explanation of the reduced revenues, and Kenya Power has decided to classify the reduction as 'unpaid debts' in its books. If he still holds the opinion that prepaid users are tampering with meters, then tampering would occur across both prepaid and postpaid users; in fact, a prepaid user has less opportunity to tamper with a meter than a postpaid user does.

My little accounting knowledge tells me that it is every company's dream to convert all its customers to prepaid. This shifts the cash flow position to a very favorable one: you have the money from customers before they consume your service or product. With a prepaid metering system, Kenya Power was heading for accounting nirvana, so the recent revelations about accumulating 'debts' from prepaid customers came as a shock to many. First and foremost, if you do not buy prepaid meter tokens, you cannot consume power on credit and pay later, so how is this reduction in revenues from prepaid customers classified as a debt rather than an outright reduction in collected revenue?

Faulty meters?

There are two main brands of power meters used by Kenya Power: Actaris and Conlog. The latter brand was found to be defective three years into the roll-out: the meters were erroneously calculating the remaining power tokens, especially after a power outage. You could have, say, 30 kWh remaining on your meter, and after a blackout the meter would read -30 kWh or some other random negative value. That is what consumers would notice; we cannot say for sure whether the same meters also under-bill. Of course, if a meter under-bills, very few consumers would complain or even notice; they would, however, be quick to notice a negative token value because they would lose power. Could faulty meters be the problem here? Could Kenya Power be suffering from substandard meters? Here is a blog link to one affected consumer who complained in 2012 about the faulty meters. Kenya Power attempted to replace some Conlog meters, but I still see some in use in the wild.

Reality of estimate billing?

We have all been there: you receive an outrageous bill from Kenya Power. This is because, more often than not, they estimate the power consumed and never actually read the meter in your house. If you are on postpaid, when was the last time you saw a Kenya Power meter reader on a motorbike in your estate? According to Kenya Power's books, the average postpaid domestic customer consumed 12 kWh of electricity and paid Sh1,432, while the average prepaid customer consumed 23 kWh and paid roughly Sh756. This can only mean one of two things:

  • The postpaid customers are over-billed due to poor estimation methods, as meters are seldom read. I noticed this on my water bill too: when my bill is, say, 600/= and I overpay 2,000/= when settling it, my next bill will be in the region of 2,000/= (estimated from my last payment). So these days I make sure I pay the exact amount on the bill, to deny them room to estimate and over-bill me.
  • The prepaid meters are spot on accurate. This is the most plausible reason and I will explain below.
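Taken at face value, the averages quoted above imply wildly different effective prices per unit of energy. A quick back-of-the-envelope check (the calculation is mine, using only the figures quoted):

```python
# Effective price per kWh implied by the quoted Kenya Power averages.
postpaid_kwh, postpaid_paid = 12, 1432   # average postpaid: 12 kWh for Sh1,432
prepaid_kwh, prepaid_paid = 23, 756      # average prepaid: 23 kWh for Sh756

postpaid_rate = postpaid_paid / postpaid_kwh   # effective Sh per kWh, postpaid
prepaid_rate = prepaid_paid / prepaid_kwh      # effective Sh per kWh, prepaid

print(f"Postpaid: Sh{postpaid_rate:.2f}/kWh")
print(f"Prepaid:  Sh{prepaid_rate:.2f}/kWh")
print(f"Postpaid customers pay {postpaid_rate / prepaid_rate:.1f}x more per kWh")
```

If the quoted averages are right, postpaid customers are paying well over three times more per unit of energy, which is hard to explain by tariff structure alone.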

Are prepaid meters accurate?

Unlike the old-school postpaid meters that measure total 'apparent' power consumed, the new prepaid meters assume an efficient electricity grid and measure the effective or real power consumed by the customer's appliances. Where the power distribution grid is inefficient, the voltage and current are not in phase, which leads to a lot of 'wasted' power. On postpaid, consumers pay for the grid inefficiencies; on prepaid, they do not. This is why there has been a drastic reduction in revenues: consumers are now paying for what they consume and not for the wastage on the grid. Perhaps this is what Kenya Power sees as 'consumed but unpaid-for power' by the prepaid meter users? It could be, because it is not possible to consume more than what you have paid for on a prepaid meter: apparent power is consumed but not measured by the meters. This is especially true if you have appliances with electric motors in them, such as washing machines, water pumps and air-conditioning systems. Read more about power factor by clicking here.
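The gap between apparent and real power can be illustrated numerically. A minimal sketch (the voltage, current and phase values below are illustrative, not measurements from any actual meter):

```python
import math

def power_components(v_rms, i_rms, phase_deg):
    """Apparent (VA), real (W) and reactive (VAR) power for a sinusoidal load."""
    apparent = v_rms * i_rms            # what a meter billing apparent power sees
    phi = math.radians(phase_deg)
    real = apparent * math.cos(phi)     # cos(phi) is the power factor
    reactive = apparent * math.sin(phi) # the part 'wasted' on an inefficient grid
    return apparent, real, reactive

# A motor-driven appliance: 240 V, 5 A, current lagging voltage by 30 degrees.
apparent, real, reactive = power_components(240, 5, 30)
print(f"Apparent: {apparent:.0f} VA, Real: {real:.0f} W, "
      f"Power factor: {real / apparent:.2f}")
```

At a power factor of about 0.87, a meter billing apparent power charges for roughly 15% more energy than the appliance actually turns into useful work, which is consistent with the revenue gap described above.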

You can read older articles on my blog touching on Kenya Power by clicking the links below:

  1. How Kenya can enjoy lower electricity tariffs
  2. Kenya is ripe for a Demand Response Provider
  3. Kenya Power Needs To Be Penalized For Blackouts
  4. There is need to end the Kenya Power monopoly

What Whatsapp voice means for MNOs

April 1, 2015

Facebook Inc. recently introduced the ability to make voice calls directly in its Whatsapp mobile application. The feature is currently available on Android and will soon be made available on iOS.

What this means is that mobile users with the updated app can now call each other over available data channels such as Wi-Fi or mobile data. Going by a recent tweet, a user who tried the service on Safaricom claims to have made a 7-minute call that consumed just about 5 MB of data. If these claims are true, then by using Whatsapp a user can call anyone in the world for less than a shilling a minute, which is lower than most mobile tariffs.

Is this a game changer?

Depends on who you ask. First, let's look at what happens when you make a Whatsapp call. When a user initiates a call to another user over Whatsapp, both of them incur data charges: in the case of the Twitter user referred to above who consumed 5 MB, the recipient of the call also consumed a similar amount of data to receive it. If both callers were on Safaricom, then about 10 MB was consumed for the 7-minute call, and the cost of 10 MB is close to what a GSM phone call of the same duration would cost anyway. Effectively, receiving a Whatsapp call now costs the recipient, unlike on GSM where receiving calls is free. When the phone rings with an incoming Whatsapp call, the first thought that crosses the recipient's mind is whether he or she has enough data 'bundles' to pick up. The danger is that if there are none, or the data bundle runs out mid-call, the recipient will be billed at the out-of-bundle rate of 4 shillings per MB. Assuming our reference user above called someone whose data had run out, Safaricom would have made 5 shillings from the caller's 5 MB and about 20 shillings from the recipient: a total of roughly 25 shillings for the 7-minute call, translating to about 3.6 shillings a minute, which is comparable to or higher than GSM tariffs.
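The economics above can be captured in a small parameterized model. The per-MB rates below are assumptions for illustration (only the 4-shilling out-of-bundle rate is quoted above; the in-bundle rate varies by bundle size):

```python
def whatsapp_call_cost(duration_min, mb_per_min, caller_rate, recipient_rate):
    """Operator revenue (KSh) for a WhatsApp call billed as data on both ends."""
    data_mb = duration_min * mb_per_min          # each party consumes this much
    return data_mb * caller_rate + data_mb * recipient_rate

# The tweeted data point: 7 minutes, ~5 MB each way. Assume the caller is on an
# in-bundle rate of ~1 KSh/MB and the recipient has run out of bundles (4 KSh/MB).
total = whatsapp_call_cost(7, 5 / 7, caller_rate=1.0, recipient_rate=4.0)
print(f"Operator makes KSh {total:.0f} total, i.e. KSh {total / 7:.2f} per minute")
```

Under these assumptions the operator earns roughly as much per minute as a conventional GSM call would bring in, which is why on-net Whatsapp calls need not frighten the operator.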

This effectively changes the cost model of making calls: the cost is now borne by both parties, something that might not go down well with most users. I have not made a Whatsapp call myself, as my phone is a feature phone, but I believe that if a "disable calls" option does not exist, Whatsapp will soon introduce one due to pressure from users who do not wish to be called via Whatsapp because of the potential cost of receiving a call. That would kill all the buzz.

Will operators block Whatsapp calls?

It is technically possible to block Whatsapp texts and file transfers using layer 7+ deep packet inspection systems such as Allot's NetEnforcer and Blue Coat's Packeteer. I believe an update to detect Whatsapp voice is in the offing, and this will give operators the ability to block it. The question, however, is what would drive them to block it? MNOs will have no problem allowing Whatsapp traffic, as it will most likely be a boon for them if most of the calls are on-net (they get to bill both parties in the call). If, however, most calls are off-net (such as those to recipients on other local mobile networks or international ones), then MNOs might block the traffic or give it a lower QoS priority, degrading call quality to the point where a conversation cannot be sustained. They might then run into problems with the regulator should subscribers raise concerns that operators are unfairly discriminating against Whatsapp voice traffic. Net neutrality rules (I am not sure they are enforceable in Kenya yet) require that all data bits on the internet be treated equally: it should not matter whether a bit is carrying Whatsapp voice, Bible quotes or adult content. This means operators could be punished for throttling Whatsapp voice traffic in favour of their own voice traffic, presenting a catch-22. What they need to do is come up with innovative ways to benefit from this development, like offering slightly cheaper data tariffs for on-net Whatsapp voice to spur increased Whatsapp usage within the network (and therefore bill both participants).

Worth noting is that it costs an operator more to transfer a bit on 3G than on 4G. Operators who roll out 4G stand to benefit from Whatsapp voice, as they can offer data at a lower cost to themselves and pass the benefit down to subscribers. With VoLTE all the rage now, Whatsapp voice can supplement VoLTE and can even be a cheaper way for operators to offer voice services on their LTE networks without further investment in VoLTE-specific network equipment.

In short, any operator who wants to benefit from Whatsapp voice has to go LTE.

Much Ado About Bundles

March 2, 2015

Kenyans (especially the Internet-savvy ones) are an angry lot. Angry because a mobile operator has put in place what they term restrictive terms of use for purchased data plans, such as:

  • Expiry of the purchased data plans 30 days after activation
  • Restricted data bundle sharing. A user can now share his or her data with others a maximum of 10 times a month, down from 50.

Kenyans' argument is simple: the operator took their money in exchange for the data, and therefore users have the right to use the plans for as long as they please and to share them as many times, and with as many people, as they wish. This simplistic argument is based on a layman's understanding of what exactly happens when you purchase a data plan.

When a user buys a data plan, a contract comes into force between the buyer and the mobile operator. The contract obliges the operator to deliver the purchased data when and if required by the user. What we need to note, however, is that the contract comes into force to offer an option, not a product or a subscription.

An option is defined as "the ability to take a predefined action for a fixed period of time in exchange for a fee." A product, on the other hand, is a tangible form of value. For value to be provided via an option, the seller must:

  • Identify some action people might want to take in the future (browse the internet)
  • Offer potential buyers the right to take that action before a specified deadline (guarantee the connection to download the purchased GBs)
  • Convince the potential buyers that the option is worth the asking price (marketing activities)
  • Enforce a specified deadline for taking the action (data plan expiry)

Options give the purchaser the ability to take a specific action without requiring the purchaser to take it. If you buy a movie ticket, for example, you have the ability to take a seat in the movie theater, but you don't have to if a more 'plotious' plan than the movie comes up. Because it is an option, you cannot seek a refund for not having watched the movie at the advertised times.

Data plans are not a product; they are an option, and are therefore bound by time for the specified action to take place. What you purchase is the ability to download x GBs, not the 'actual' GBs, and this ability is time-bound just like your movie ticket. I think the fact that most Kenyans refer to them as 'bundles' signifies their belief that they have purchased a product.
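The option framing can be made concrete with a toy model. A sketch (the class and its rules are my own illustration, not the operator's actual terms):

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataOption:
    """A data plan as an option: a time-bound right to consume, not owned goods."""
    total_mb: float
    activated: date
    validity_days: int = 30
    consumed_mb: float = 0.0

    def exercise(self, mb: float, today: date) -> None:
        """Consume data, i.e. exercise the option, but only before the deadline."""
        if today > self.activated + timedelta(days=self.validity_days):
            raise ValueError("Option expired: the right to consume has lapsed")
        if self.consumed_mb + mb > self.total_mb:
            raise ValueError("Option exhausted: nothing left to exercise")
        self.consumed_mb += mb

plan = DataOption(total_mb=1000, activated=date(2015, 3, 1))
plan.exercise(200, today=date(2015, 3, 10))   # within 30 days: allowed
# plan.exercise(200, today=date(2015, 4, 15)) # past the deadline: would raise
```

The operator sold `exercise` rights, not the megabytes themselves; once the validity window closes, any unconsumed balance lapses exactly as an unexercised option does.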

Some people argue that because money changed hands, the end-user should determine his or her own pace of use of the data plan, with no time limit on usage. What we forget, however, is that while the contract came into force when you purchased the data plan, ownership was not transferred from the operator, because this is not a product but an option. The contract specifies the terms on which the data plan (not bundle) will be delivered to you, but it does not transfer any deeds to the end-user. Because options amount to dispositions of future property, in common law countries they are normally subject to the rule against perpetuities and must be exercised within the time limits prescribed by law.

It is just like companies that mostly offer employees share options rather than share ownership: options have limited specified actions and a time limit attached to them, as opposed to share ownership.

The best the users can do is to petition the operator to revise the rules governing the options but not pontificate online about what is essentially an offer to take up an option and not buy a product.

When the operator came up with the feature that enabled a user to share, or sambaza, their purchased data plan with others, what was really happening is that users were transferring their purchased option to a different party on a commercial basis. The fact that a user could make the transfer many times posed a danger for the operator because:

  • The exchange of money for the option was between the operator and the purchaser, and the contract is therefore enforceable between these two. Sharing the data bundle was innocently aimed at fostering data usage but had the inadvertent effect of complicating the options contract. Who should complain if the service is slow or poor: the original purchaser or the recipient of the shared data? You might argue that the recipient has a SIM card and is therefore in contract with the mobile operator, but purchasing and activating a SIM card constitutes an invitation to treat; no options contract comes into force by activating a SIM card.
  • The option rules must be understood by the recipient for them to accept. The fact that some people had started purchasing wholesale data and retailing it at much lower prices than the operator was not the issue; the issue was that the operator found themselves in a legal quagmire, as there were now people on the network exercising options they had not purchased. The retailers were purchasing the wholesale bundles as options and selling them as products.
  • An option for a wholesale data bundle has a longer period in which the user can exercise the option, the exercise being the consumption of the data bundle in a manner that will deliver the agreed quality of service. A 200 GB bundle has a longer expiry period than, say, a 10 MB bundle, because based on the network resources, the larger bundle can be delivered over a longer period of time. If you now take the 200 GB and 'sell' it by sambaza-ing 2 GB each to 100 people, who then proceed to consume the 200 GB within 3-4 days, that voids the contract. The 200 GB was offered at a much cheaper price because of the predictability of the network resources required over the longer period in which it was to be consumed; if it is consumed in a manner inconsistent with the initial agreement, which was meant to ensure that its consumption also allowed other users to enjoy their options, the contract is void. In the same way, you cannot demand that a movie in a theater be fast-forwarded through scenes you don't like; data options have usage rules. If you make such a demand in a movie theater, the option contract becomes void and you will be asked to leave with no refund.

Citations on some legal terms taken from:

  • translegal.com
  • Wikipedia

New ideas needed in the African telecoms scene in 2015

December 31, 2014

As 2014 comes to a close, the continent's telecom sector players have had a rather mixed year. Those who were lucky and made a tidy return during the year need to be aware that most of the innovative technology that enabled them to return a profit is approaching the point of diminishing returns. If they are to make it through 2015 and beyond, they will need to out-innovate themselves and the competition.

At the last Africacom conference held in Cape Town, several leading telecoms analysts noted that telecom operators in Africa (especially mobile ones) are confused: unsure whether they are banks, insurance firms, hardware vendors, money transfer entities or fixed broadband ISPs. In my opinion this confusion lies in the fact that African operators are close to 100% dependent on vendor-driven, as opposed to market-driven, innovation. Given that about five major vendors serve most African operators (Ericsson, Nokia, Huawei, ALU, Cisco), a lot of copycat innovations have been shoved down the operators' throats. The lack of in-house, or external but vendor-independent, innovation 'think tanks' (for lack of a better word) will be their undoing.

Below are some points that I believe any wise telco CEO needs to be aware of in 2015.

Application software (Apps)

For a long time, broadband operators in Africa have been selling bandwidth pipes to connect users to the Internet. With the 'appification' of many services and platforms, browsing via web browser software is slowly diminishing. The good thing about this is that, to some extent, end users cede control of how much data is transferred to the apps, leading to higher data consumption spread over a 24-hour period per person. More data use means more revenue; usage spread over 24 hours means a more predictable and stable network.

African operators need to work with content providers on the development of apps that will spur bandwidth consumption and simplify life for users. The burden of app development has been left mostly to young hobbyists in incubation centers and freelance programmers; it's time operators took this seriously and worked with developers, especially by funding their start-ups. Operators such as Safaricom in Kenya and Millicom in Tanzania have already set up venture funds towards this. The effect is that these apps will spur a data boom.

Video On Demand

In the past, operators have been cautious about offering VOD services due to several factors, such as:

  • Lack of a payment platform due to the very low penetration of credit cards in Africa
  • Unstable networks that would ruin a VOD experience
  • Expensive bandwidth that made it cheaper to lease/buy a DVD movie
  • Lack of VOD ready customer premise equipment

The above barriers are now rapidly vanishing. For example, there might not be massive uptake of credit cards in Africa, but mobile money platforms have to some extent covered this gap; another promising feature is the ability to pay for services and downloads from your mobile phone airtime, aka mobile operator billing. The main area that needs to be worked on by operators and regulators is the high cost of bandwidth that is still prevalent in many African countries. The telecoms sector is a major source of revenue for many governments by way of spectrum and operating license fees, and this cost is passed down to consumers, making services expensive. If governments lowered their appetite for license revenues and instead let cheaper bandwidth spur economic gains, the continent would stand to gain more. There are over 100 registered VOD operators in Africa, and this number is bound to grow if bandwidth becomes cheaper. With a counterfeit movie DVD going for about $0.5 on Nairobi streets, VOD will take off when the cost of streaming a video online is lower than that: roughly 1.4 GB for less than $0.5. The African VOD experience need not be a carbon copy of the US or EU versions; lower-quality videos (and hence lower bandwidth consumption) will find a niche here, I believe. Remember when people dismissed YouTube by asking who would want to watch grainy videos shot by amateurs on a mobile phone? Remember when people dismissed Nollywood, saying there was no market for such low-cost, simple-plot movies? Low-quality VOD could work here in the short term.
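The street-DVD benchmark gives a hard ceiling on what VOD bandwidth can cost. A rough sketch using only the figures quoted above:

```python
movie_size_gb = 1.4    # a full-length movie at modest quality, as quoted above
street_dvd_usd = 0.5   # counterfeit DVD price on Nairobi streets

# For VOD to beat the street DVD, delivering the whole movie must cost the
# viewer less than the DVD does, which caps the retail price of bandwidth.
max_price_per_gb = street_dvd_usd / movie_size_gb
print(f"Bandwidth must retail below ${max_price_per_gb:.2f}/GB for VOD to win")
```

That is under $0.36 per GB delivered; any operator whose data tariffs sit above that line is competing against the DVD hawker and losing.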

VOD can provide additional revenue streams to operators if done well. It can also backfire on operators who cannot meet the surge in data demand that VOD brings: it is one thing to say you offer VOD and another to ensure that your network does not collapse under the VOD load. Video traffic is forecast to increase 14-fold between 2013 and 2018, and it is estimated that over two-thirds of data on most networks, including mobile, will be video by 2018. VOD is an opportunity for the prepared and a risk for the unprepared.

Shift from Infrastructure investment to service delivery

Too many operators today are busy investing in and maintaining infrastructure. This is a very outdated way of doing things, and we have begun to see a shift away from it: in 2014, Airtel sold its cellphone towers to a third party and now pays to get service from them. This has two major effects:

  • Infrastructure-associated costs move from the fixed-cost to the variable-cost column of the financial books. This gives a great boost to financial health and makes the company more resilient to market and revenue shocks.
  • Ownership of infrastructure makes operators rigid and slow to adapt to changing customer needs; sometimes the change, if it happens at all, is not fast enough to meet market demands. I remember working on a project to install an MMS platform for a local MNO; before the service was even officially launched, Whatsapp took the multimedia file exchange scene by storm. The firm had already spent millions. If this had been a third-party service instead, they would have spent less and minimized the risk associated with the dismal uptake of MMS services.

Operators need to shift from being technology oriented companies to being service oriented. By service oriented I do not mean becoming a service marketing company by outsourcing everything other than the sales and marketing, I mean their critical business decisions should be informed by meeting customer needs as opposed to deploying the latest, fastest, smoothest or shiniest piece of tech.

Re-look at Value Added Services (VAS) strategies

The 'VAS or perish' song has been sung many times in many a conference I have attended. The problem now arising is that operators are coming up with what they believe is VAS but is in effect a burden to the consumer. Take, for example, a certain operator in South Africa who sent me about four SMSes after every call I made on their line: an offer to enable directory services, an offer to automatically send my vCard to every person I called, a notification of how much airtime my call had consumed, an offer for an international bundle whose activation process involved five steps, and many more. It was outright annoying and took repeated calls to their call center to turn off. It felt more like value attrition than addition.

That aside, most people relate VAS to mobile operators only; fixed-line ISPs, broadcasters and others also need to embrace the idea of adding value to their existing services. The tragedy is that many have confused product improvement with value addition; the two are different and can easily be told apart. A fast-food restaurant improving the quality of its burgers and fries is product improvement; adding a small toy to all kids' meals is value addition. This example also shows that for value addition to work, the product must first meet customer expectations, otherwise VAS is a waste of time. Many operators use value addition to try to improve the product instead of using it to elicit further delight from the customer (which is what creates stickiness). Of what use is the toy in a badly prepared kids' meal? In short, if what an operator calls VAS ends up improving the product rather than eliciting customer delight, it's not VAS. Many operators in Africa are adding toys to burgers with rotten patties, which is why many so-called VAS strategies don't work: they were simply product improvements disguised as VAS.

Have a happy new 2015!

Ideas For CIOs and IT Managers On Securing Their Networks

November 21, 2014

There has been a lot of talk about increased cases of cyber criminals accessing information stored on computing networks. Many an events organization has held conference after conference targeting IT managers and CIOs to ostensibly sensitize them on the matter. Many have gladly drawn attendance cheques in favour of these conference organizers for a seat or two, where they go through slide after slide on how to protect their information and data. After the conference, the usual group photo (and many selfies) are taken, not forgetting that one photo where the IT manager or CIO receives a certificate of participation from the organizers and their sponsors.

The reality on the ground is that many conference-certificate-waving CIOs still ignore and fail to implement basic measures to protect their networks and information. Their ignorance, however, is no defense, as cyber criminals continue to seek ways into their networks. These criminals try to gain access to networks for two main reasons:

  • To steal information and data from you
  • To use your network as a launch pad for further attacks. This is mostly done to cover their tracks: a Romanian criminal attacking, say, a US bank will most likely carry out the attack from an unprotected network in Africa or anywhere else.

I would like to put the issue of cyber security into perspective based on my experience in running large networks for the last 10 years or so.

Why are you a target?

You are a target because you are connected to the public internet; it's as simple as that. As long as your IP addresses are routed over the public Internet, you will be a target. It does not matter whether you are a bank, an insurance firm, a government, the Vatican or even a small two-computer CBO office in Lokichar. You will be attacked for as long as you are online.

How do you tell if you are under attack?

No, when you get attacked you won't see your mouse pointer moving on its own, opening files and spewing thousands of lines of code scrolling across your screen like in the movies. It is hard to tell you are under attack just by sitting at your PC. However, if you measure several key parameters on your network, you can tell whether you are under attack (whether the attack is successful or not is not the issue here). The first is your firewall's CPU usage. Many firewalls are low CPU users if configured properly (I am using the term firewall loosely here for now); rarely will a properly sized firewall consume more than 25% CPU. If your firewall is consuming more than that, either it is the wrong firewall size for your network or it is wrongly configured. So if your CPU usage deviates from the normal by a huge margin, you are under attack. Below is a graph of my firewall CPU when it was busy fighting off a massive attack. As seen, CPU shot to 100% for some time as cyber criminals initiated a DDoS on all my /20 and /18 public address space on the Internet. If under ordinary operation my CPU were, say, 85%, that would leave just 15% to fend off possible attacks, giving a higher probability of an attack succeeding because of the smaller, less powerful firewall.

CPU usage on the firewall showing a spike in % CPU cycle usage during an attack.

The other symptom that you are under attack is unusually slow network response times. However, network performance should not be used as the only indicator; rather, it should be considered together with other symptoms, because many factors other than an attack can slow your network down. Firewall software systems reside in memory for faster access by the firewall engine, so you will rarely see an increase in memory utilization during an attack. Memory utilization increases in firewalls are mostly due to turning on additional features: for example, a firewall's memory utilization increases if you turn on inbound SSL certificate inspection or mail scanning. It is advisable to turn off features you do not use on any device on your network. Also, just because a firewall has a feature you need does not mean you have to use it on the firewall device. For example, instead of letting the firewall do email spam scanning, you can turn that off and do it on a dedicated mail-scanning Linux box. This frees up CPU power for network protection.
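The 'deviation from normal CPU' heuristic described above is easy to automate. A minimal sketch (illustrative only; in practice you would poll the firewall's CPU counter over SNMP or its management API rather than hard-code samples):

```python
from statistics import mean, stdev

def cpu_alert(history, current, sigmas=3):
    """Flag a CPU reading that deviates sharply from the recent baseline."""
    baseline, spread = mean(history), stdev(history)
    # Floor the spread so a very flat baseline doesn't trigger on tiny wobbles.
    return current > baseline + sigmas * max(spread, 1.0)

# Hourly samples from a properly sized firewall hovering around 20% CPU.
normal = [18, 22, 19, 21, 20, 23, 17, 20, 19, 22]
print(cpu_alert(normal, 24))   # ordinary fluctuation: no alert
print(cpu_alert(normal, 95))   # DDoS-like spike: alert
```

Wire such a check into email or SMS notification so the spike is seen when it starts, not the next morning.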

Next-generation firewalls have inbuilt systems that can warn you when they detect suspicious activity. These warnings can take the form of an email sent to you with details about the attack. A good example is the email below, showing an attempted TCP scan for any open SSH port 22 on my network by a criminal in Russia, and an ICMP flood attempt by another in China. If the Russian criminal had found an open port 22 on the scanned IPs, he would have embarked on hacking the device with that port open; he was, however, blocked at the firewall and the attempt reported.

A screenshot of an email from a NextGen firewall detailing attempted attacks on the network.

Getting a good system that can alert you to suspicious activity via email or SMS is highly recommended. You do not want to arrive at work in the morning and find a gory cyber crime scene just because you were never alerted when it all started.

Are all firewalls equal?

Of course not. Many IT admins grew up in Cisco environments and sat for Cisco certifications, which they proudly display on their CVs; they have therefore been conditioned to believe that anything by Cisco must be the best on the market. That is very far from the truth. From experience, Cisco offers very good protection up to layer 4 of the OSI model; beyond that (where most attacks occur), its performance has been very poor, even with the move from the Cisco PIX to the Adaptive Security Appliance (ASA). There are many comparisons online of the ASA versus other firewalls, like the one here comparing the Cisco ASA and Fortinet's FortiGate firewall (which in my opinion is the best firewall in the world).

Next-generation firewalls combine an Intrusion Prevention System (IPS) with OSI layer 7 application control backed by Deep Packet Inspection (DPI). The system is therefore both application and content aware, which is what makes it a Unified Threat Management (UTM) system.

Measures to protect your network

There is no one-size-fits-all solution to tackling the ever-increasing attacks in cyberspace. However, based on my experience, the following steps are recommended:

  1. Shut down all unused services on your network. For example, if you have a Linux server with the Domain Name Service (DNS) running yet you do not use it, stop the DNS daemon. This lowers the risk of a criminal gaining access to your network; remember that they need to establish a network/Internet socket to gain access. A socket is made up of an IP address and a port. They have the IP; don't give them the port.
  2. Use non default ports. If you have to use a service within your network, it is advisable to use non-default ports for these services. For example, everyone knows that SSH runs on port 22, that will be the port a cyber criminal will most likely look for. Running SSH on say port 2222 will contribute to an extent to the security of your service incase the criminals manage to gain access past the UTM system. In addition to this, avoid using public DNS for domain name to IP mapping for internal services. But how will users access the services and DNS if they are outside the office network? (see point 4 below)
  3. Control access. Even after changing the ports as per the point above, it is also advisable to set access control rules to the services running on your network. This can be done by use of authentication (username/strong password pair), restricting which IP’s can access the ports via the use of access lists, restricting time of day when the services can be accessed if possible, use management policies such as frequent mandatory password changes. Also, highly recommended is the use of RSA  security tokens in addition to the passwords.
  4. Use of Virtual Private networks (VPNS). if you have users who need to access resources in the office network from outside the office (e.g a traveling salesman), they should do this by use of a Dial-In VPN service. This service should terminate at your UTM device
  5. Use a proven UTM appliance. Do your research before falling for marketing ploys, just because it’s from Cisco, it does not mean its the best. Just because its expensive, it does not mean it can do more/better/faster. Use of “systems that can scale” is a common buzz word in the ICT world mostly applied to having a system that will grow with your use. In the UTM world, a system that can scale is one which other than growing with your needs will also adapt quickly to changing nature of threats. For example, how long did your UTM vendor take to update their IPS signature with the heartbleed vulnerability? a 6 hour delay after the discovery of the threat led to the Canadian Revenue Agency losing taxpayer data.
  6. Enforce Bring Your Own Device (BYOD) policies. One of the easiest ways for criminals to gain access to your network is through the use of compromised systems belonging to your staff. That iPad that your CEO or that smart phone your accountant brings and connects to the office WiFi, is it safe? There are now many BOYD best practice recommendations including the simplest which is having such devices connect to a different and policy controlled VLAN in the office. many free apps that smart phone users download have back doors through which criminals can gain access to your network if the device is connected via WiFi.
  7. Control resource use. By use of policies such as those offered by Microsoft domain controllers, the IT admin can enforce resource use policies such as disable installation of software onto computers by staff. Many pirated software programs habour malware and back doors that can be used by criminals.
  8. Use of Internet Security Software. Also commonly known as Antivirus programs, each node on a network should have an updated Internet security software. These have evolved from being plain Antivirus detectors to security suites that provide protection from phishing, malware and insecure web browsing. The jury is still out on which is the best security software. I would highly recommend Kaspersky end point security software followed by Sophos.
  9. Gain visibility. A survey showed that over 70% ofCIOs have no idea what type of traffic runs on their network. By gaining visibility on what is running on the network and what time,CIOs can lower the risk of an attack. The graph below shows traffic running on a network identified by a device that can do Deep Packet Inspection (DPI). a simple system will classify Facebook traffic as HTTP (because its via port 80 at layer 4), with a DPI device, you can gain insights into exactly what is running on a network and control it. In the example below, because he can now see whats running on the network, a CIO may decide to block Yahoo mail access from the office network if he feels it poses a threat to the network if users will download malware or click on spam links on personal emails from within the office network.

    Protocols

    Graph from an application aware DPI device showing protocols at layer 7
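Point 1 above rests on the socket idea: a service is reachable only when some process is listening on an (IP, port) pair. A minimal Python sketch (the hostname and the port list are illustrative assumptions, not from the article) that an admin could use to audit which common ports are accepting connections:

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the TCP handshake succeeds,
            # i.e. a service is listening on that (IP, port) socket
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found

if __name__ == "__main__":
    # Common default ports: SSH, DNS, HTTP, HTTPS
    print(open_ports("127.0.0.1", [22, 53, 80, 443]))
```

Anything this reports that you do not recognize as a deliberately exposed service is a candidate for shutting down.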

What about encrypted traffic?

With the increase in the use of Secure Socket Layer (SSL) encryption on the open Internet after the NSA debacle, many networks are noting a steady rise in encrypted traffic, especially HTTPS. Older UTMs are unable to inspect encrypted traffic, and this poses a great danger to networks. A recent report by Gartner Research says that less than 20% of organizations inspect encrypted traffic entering or leaving their networks. You might be wondering if it is possible to inspect SSL-encrypted traffic: yes, a good UTM system can decrypt most SSL-encrypted traffic and confirm certificate authenticity. This ensures that only traffic with genuine encryption certificates enters the network.

Frequently Asked Questions part II

October 24, 2014 3 comments

Further to my previous attempt last year to answer some common questions people ask in relation to everyday technology, I have come up with a second list of answers to more common questions. I will try to make my answers as simple as possible.

Why are smart phones poor at battery power conservation?

Most of you have experienced this: your newly released iPhone or Samsung Galaxy phone has to be recharged every day, compared to your feature phone ('Kabambe' 2G phone) which is charged once a week. You have also noticed that when you turn off Wi-Fi and data services on your high-end phone, the battery lasts longer. Other than the number of apps running on your phone, there is another factor that greatly affects your phone's battery consumption rate: signaling. Beyond your deliberate use of the phone for calls and data transfer, the phone is also in constant communication with the base station, exchanging what is known as signaling data. In fact, in older, poorly designed mobile networks, the signaling data exceeds the actual user data. By turning off data on a phone, you greatly reduce the amount of signaling exchanged between your phone and the base station and hence conserve power. There have been attempts by base station equipment manufacturers to lower the amount of signaling from phones, but this hasn't worked very well. In fact, one of the biggest attractions of newer technologies such as 4G/LTE is not the faster data rates but the lower signaling volume and more efficient signaling techniques they use; other than offering faster data rates, these newer technologies are also easier on your battery.

My Internet speed tests do not match my link's performance

We have all been there: your Internet link seems slow and your YouTube videos are buffering, but when you perform a speed test your results are spot on at your subscribed plan of, say, 10Mbps. Well, let's start with the basics. You call it your link 'to the Internet', but have you ever wondered where this Internet is located? The fact is that most of the content we consume here in Africa is not hosted within the continent. This means that you have to traverse an undersea cable to get your content. For example, to access bbc.co.uk from Nairobi, you have to go all the way to Mombasa, take an undersea cable either via South Africa or the Suez Canal to Europe, and on into the United Kingdom to get to the website. So if it's a news clip on the BBC website, the video traffic has to travel all that way. However, when you perform a speed test, the speed test app on your phone or your browser is deliberately redirected by your ISP to a server within the country. So if you are in Nairobi CBD on a Zuku link, the speed test server is on Mombasa Road, and if you are on Safaricom then the speed test server is on Waiyaki Way. Because this is very near, your data transfer rates will be very fast compared to running the same test against a server in Europe. On some speed test websites such as http://www.speedtest.net you can manually select a server; try selecting a server in Nairobi and one in Europe or the USA and note the difference. So when your ISP sells you 10Mbps, it's a 10Mbps circuit and not necessarily 10Mbps to the Internet. This state of affairs is slowly changing as content providers such as Google and content delivery networks such as Akamai are now caching traffic locally. This means that Akamai will keep a copy of frequently accessed content on a server in Nairobi, cutting out the trip to fetch the content from the US/EU.
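The distance effect has a concrete mechanism: a single TCP connection can never move data faster than its window size divided by the round-trip time, so the same circuit feels very different against a nearby server and a European one. A rough illustration (the window size and round-trip times are assumed for the example, not measured):

```python
def max_throughput_mbps(window_bytes, rtt_seconds):
    """Upper bound on one TCP connection: window / round-trip time, in Mbps."""
    return window_bytes * 8 / rtt_seconds / 1e6

# A 64 KB receive window, typical of an untuned TCP stack
window = 64 * 1024

# ~2 ms round trip to a speed test server in the same city...
print(round(max_throughput_mbps(window, 0.002)))  # 262 (Mbps)

# ...versus ~200 ms to a server in Europe over the undersea cable
print(round(max_throughput_mbps(window, 0.200)))  # 3 (Mbps)
```

The circuit is the same in both cases; only the round-trip time changed, which is why a local speed test can show your full subscribed rate while a distant download crawls.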

Why did the inventors of blue LED win the Nobel prize and not the inventors of Red and Green LED?

The 2014 Nobel Prize in Physics went to three scientists who invented the blue Light Emitting Diode (LED) in the early 1980s. The red and green LEDs had been invented in the 1950s, and their inventors never won the Nobel. The answer is two-fold:

  1. You need red, green and blue LEDs to make white light. It was impossible to make white light out of only the red and green LEDs; blue was needed.
  2. Making a red or green LED was a straightforward process of sandwiching several crystalline elements together, but making a blue LED in a similar fashion led to the quick destruction of the LED's structure, causing it to disintegrate immediately due to the elements involved. The winning approach involved growing the crystal elements on each other, as opposed to taking existing element crystals and physically fusing them together.

Other than providing the ability to produce white light (red + green + blue = white), the trio went on to turn their blue LEDs into blue lasers, found in Blu-ray players. Because the wavelength of blue light is shorter than that of red light, the beam can be focused to a smaller spot. This lets you cram more information onto a disc and read it out, giving Blu-rays better picture quality than regular DVDs.

The white light from LEDs is in use in many areas of your normal life. The screen from which you are reading this article is lit by LEDs, and the white you see on the screen is made possible by combining the three primary colour LED lights. LEDs are also finding increasing use in lighting buildings: an LED light uses 75% less energy and lasts 25 times longer than a traditional incandescent bulb.

Why do cellphones no longer sport the protruding antenna?

Advances in antenna engineering have led to the development of antenna arrays that can work as a single antenna. It is therefore possible to put many tiny antennas on an electronic circuit board and use them as one would a single antenna. The biggest problem with using several antennas was the increased scattering loss and the introduction of noise into the signal. Newer coding techniques that work in noisier channels mean that a signal can still be extracted even in a noisy environment, making the use of micro antenna arrays possible. This is why modern handsets do not have the protruding antenna 'finger'.

Why isn’t the more efficient and durable Einstein-Szilard refrigerator in production?

Unlike today's refrigerators, which use a mechanical compressor with moving parts, the Einstein-Szilard refrigerator, patented in 1930 by Albert Einstein and Leo Szilard, uses an electromagnetic pump with no moving parts. Some modifications of the Einstein system do not even use electricity but can run on any heat source, such as a flame from paraffin or cooking gas. This design was more efficient, silent and durable; it was maintenance-free and could last 100 years. The refrigerator patents were bought by Swedish home appliance manufacturer Electrolux, ostensibly to stop its mass production. No manufacturer would want to build something that would never break down and would last 100 years. Electrolux is one of the largest compression-type refrigerator manufacturers in the world; compression-type systems have moving parts and therefore don't last long. In your lifetime you will buy 2-3 compression-type refrigerators, as opposed to inheriting one Einstein-Szilard unit from your parents = no cash for manufacturers.

Why do cars turn?

From a simplistic view, when you turn the wheels of a car by use of a steering wheel, the car should not turn. This is because if the two steered wheels turn by the same angle, they will have different centres of turning circles: the inside wheel and the outside wheel need to trace out circles of different radii, and if forced to the same angle they will simply skid in the general direction of inertia. This problem was solved by a German engineer called Georg Lankensperger, and the solution was patented by a patent agent called Ackermann; it is therefore known as the Ackermann steering geometry. This is a geometric arrangement of linkages in the steering of a car or other vehicle designed so that the wheels on the inside and outside of a turn trace out circles of different radii. It is achieved by making sure all the wheels on the car share the same pivot point (denoted by D), as shown below. As the car moves faster, the point denoted by D moves forward to somewhere closer to the driver's seat, and a small twist of the steering wheel has a big turning effect compared to when the car is slower.

Ackermann geometry showing the different angles the front wheels assume to enable a vehicle to turn. In this diagram, the outer wheel is at 23 degrees while the inner is at 20 degrees, viewed from common point D. If both wheels turned by the same angle, the car would skid forward.
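The geometry lends itself to a short calculation. The sketch below (the vehicle dimensions are made-up illustrative values, and it follows the usual Ackermann convention that the wheel on the inside of the turn, having the smaller radius, steers through the larger angle) computes the two front-wheel angles whose axes meet at the common pivot point D:

```python
import math

def ackermann_angles(wheelbase, track, turn_radius):
    """Steering angles in degrees for the (inner, outer) front wheels.

    wheelbase: front-to-rear axle distance; track: left-to-right wheel
    spacing; turn_radius: radius of the circle traced by the midpoint
    of the rear axle. All in metres. Each wheel's axis is aimed at the
    same pivot point, so the two angles differ.
    """
    inner = math.degrees(math.atan(wheelbase / (turn_radius - track / 2)))
    outer = math.degrees(math.atan(wheelbase / (turn_radius + track / 2)))
    return inner, outer

# e.g. a 2.6 m wheelbase, 1.5 m track, turning on a 7 m radius
inner, outer = ackermann_angles(2.6, 1.5, 7.0)
print(f"inner {inner:.1f} deg, outer {outer:.1f} deg")
```

With these made-up dimensions the two angles come out a few degrees apart, in the same ballpark as the 23/20-degree pair shown in the diagram.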

Are the three wires I see on power lines live, neutral and earth?

We are all familiar with the power socket outlet, where the three holes provide contacts for the live wire, the neutral wire and the earth wire. The assumption by many is that the three wires we see on power lines represent live, neutral and earth. In actual fact, the three wires on the pole are all live. The earth cable does not leave your premises: it is usually connected to a copper rod buried in the soil somewhere near your house, and sometimes to your metal water pipes (ever got a mild electric shock when you touched a tap in the shower while you had an open wound?). The neutral cable leaves your house but is connected to the body of the transformer that supplies your house. The live wire leaves your house and is interconnected through transformers to the generating units, wherever they are located.

Domestic users usually have a single live cable and a neutral cable coming into their premises, carrying 240 volts. Heavy and industrial users who consume more power have all three live cables plus a neutral cable coming into their premises, with each of the three carrying 240 volts. Because AC electric current is a sine wave and the three 240-volt sources are 120 degrees apart, their sum is not 720 volts (240 + 240 + 240) but 415 volts (240 x 1.732). We use the value 1.732 because it is the square root of 3 (you can Google why). The reason a heavy user is asked to use three-phase power is that the heavy load is distributed across the three phases as opposed to a single phase. The overall power drawn from the grid is therefore lower, because less waste heat is produced in three wires than if all the power flowed through one wire in a single phase. Kenya Power sets a limit of 2kW load on single phase.

One more point to note: the three wires you see on poles usually carry 11,000 volts each, and this is lowered by the transformer in your neighborhood to 240 volts. Power is transmitted at 11,000 volts because over long distances this results in lower losses than transmitting at 240 volts. To minimize the losses, it is brought to your neighborhood at 11,000 volts and stepped down to 240 volts by the transformer.
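The 240 x 1.732 figure can be checked directly. A short sketch that computes the line-to-line voltage and also verifies the phasor fact behind it (three sine waves 120 degrees apart sum to zero at every instant, which is why a balanced three-phase load returns essentially no current through the neutral):

```python
import math

PHASE_V = 240.0  # RMS voltage of each live wire measured to neutral

# Voltage between any two live wires: the two phasors are 120 degrees
# apart, so the magnitude of their difference is sqrt(3) times PHASE_V
line_v = PHASE_V * math.sqrt(3)
print(f"line-to-line: {line_v:.1f} V")  # 415.7 V, quoted as 415 V

# The instantaneous sum of three 50 Hz sine waves 120 degrees apart is zero
for t in (0.0, 0.003, 0.007):
    total = sum(math.sin(2 * math.pi * 50 * t + k * 2 * math.pi / 3)
                for k in range(3))
    assert abs(total) < 1e-9
```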

Why The Plan To Improve Power Generation Capacity Will Backfire

September 15, 2014 5 comments

One of the promises by the current government to the citizens is improved electric power supply and the connection of more homes and businesses to the national grid. This is indeed an excellent plan, as a good power supply is an enabler of better living standards.

Kenya's current power generation capacity stands at 1700MW, and the country consumes about 1400MW on average. The government plans to increase this capacity to 5000MW in the next few years; a 194% increase. Plans are already in full gear to meet this promise, despite some challenges which I believe are more political than technical.

It is estimated that Kenya has a 10,000MW potential from geothermal power and is grossly under-utilizing it at the current installed geothermal generation capacity of 209MW. The government formed the Geothermal Development Corporation (GDC) to champion the harnessing of this resource several years ago. However, politics has bedeviled the corporation and, to date, it has not facilitated the generation of even 1MW directly. Progress is, however, being made, as GDC has already engaged reputable drilling and generation companies. The main mandate of GDC is 'to avail steam to power plant developers to generate electric power'.

We have also seen the controversial award of the tender to set up a coal-based power plant at the coast to Centum and its partners. I will not wade into the controversy surrounding the tendering process, but this is also one of the key projects the government is undertaking towards availing 5000MW to the national grid. The tender stipulates certain technical and commercial conditions to be met by the investors, among them the provision of generated power to the national grid at a lower price and the use of high calorific value coal.

On Saturday, Dr. David Ndii wrote an article in the Saturday Nation showing that we as a country do not need 5000MW; by his estimates, we need about 2700MW if history is anything to go by. He showed that, as time passes, the power required to produce one unit of the country's Gross Domestic Product (GDP) is decreasing, not increasing, and that it is a fallacy to imply that increased power output will lead to faster economic development. Power consumption is more a result of development than its cause. I agree with his sentiments.

Assuming these ambitious projects do actually take off, my biggest concern is the effect of this excess capacity on consumers. Simple high school economics might suggest that an over-supply of electric power would lead to a cheaper per-unit cost, but this might not be the case; in fact, if these ambitious projects succeed, the per-unit cost of power might go up because of the over-supply. Power generation (and other utility systems) that involve the private sector are tricky, and their outcomes might not obey simple laws of economics.

The biggest problem we have with the current arrangement between the government and the independent power producers (IPPs) is the flawed contracts that favor the IPPs and leave the consumer exposed. The contracts are based on the government buying all the power produced by these producers, irrespective of whether the government finds use for it or not. With current consumption of about 1400MW against a production capacity of 1700MW, the 5000MW the government is promising in the next two years will cause a glut. Unfortunately, this glut will not cause prices to go down but to go up, because consumers will have to pay for this purchased but unconsumed power.

SGR is not an ideal consumer

There is no ready market in the short term for this power. Many people say the upcoming projects will need it, and they go ahead to quote the electrification of the standard gauge railway (SGR) system, which is being touted as one of the key consumers of this power. The other viable consumer is Konza techno city, whose ills I've discussed elsewhere.

First things first: a train is a very efficient mode of transport; a freight train can carry 1 ton of goods for 300km on a litre of diesel. Doing that same work electrically translates to about 16kWh of electric energy, and at 18 Shs per kWh that is 288 Shs, compared to today's price of 104 Shs for the litre of diesel. The current SGR project is designed to run diesel-electric locomotives, and converting it to a pure electric system would cost a lot. In the US, it is estimated to cost about 292 million shillings per kilometer to convert a traditional rail system to an electric one. This being Kenya, the cost per kilometer is bound to be higher, especially due to corruption. So if the government wants the rail system to be electric, they had better build an all-electric system from day one, as future conversion will be expensive.
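Using the figures quoted above (one litre of diesel at 104 Shs, with the article taking the electrical equivalent of that litre's work as roughly 16kWh at 18 Shs per kWh), the cost comparison is a few lines of arithmetic:

```python
DIESEL_SHS_PER_LITRE = 104   # pump price quoted in the article
ELECTRIC_SHS_PER_KWH = 18    # grid tariff quoted in the article
KWH_PER_LITRE_OF_WORK = 16   # article's electrical equivalent of one litre

# Cost of doing one litre-of-diesel's worth of haulage electrically
electric_cost = ELECTRIC_SHS_PER_KWH * KWH_PER_LITRE_OF_WORK
print(electric_cost)  # 288 (Shs)

# How many times more expensive electric traction is at these prices
print(round(electric_cost / DIESEL_SHS_PER_LITRE, 1))  # 2.8
```

At these tariffs, electric traction costs nearly three times the diesel it replaces, which is the article's point about the SGR not being a natural sink for the extra generating capacity.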

The idle capacity trap and lopsided contracts

In their paper titled "Manufacturers' response to infrastructure deficiencies in Nigeria", published by the World Bank, authors Kyu Sik Lee and Alex Anas look at the cost of private infrastructure provision in Nigeria, with a focus on electric power. They noted that over 75% of private power generation capacity remains idle most of the day and is used briefly during peak load periods. This is due to the nature of Nigerian (and, by extension, African) power usage patterns, which can best be described as sawtooth in shape. The high percentage of idle capacity results in a very high total average cost of private power generation. This idle capacity is a cost that these private operators incur and must pass on to the consumer.

The current Kenyan contracts mandate the government to buy the "power produced" and not the "power demanded by the market". This makes things even worse, as consumers will be paying for both the idle capacity and the excess generated power. I wrote about the pitfalls of such lopsided contracts in January 2012 (read the article here), when I said the country needs to be wary of the contract it signed with the wind power producer Lake Turkana Wind Power (LTWP). This contract mandated Kenya to buy power from the wind farm "when they generate". With wind flow and speed prediction not being an exact science, this means that should the wind blow at whatever time and date and the turbines turn, Kenya has to buy that power. This is unreasonable: the wind might start blowing right after Kenya Power has asked a fossil-fuelled IPP to generate to meet peak-time demand, and while that IPP is generating, Kenya Power receives notification that the wind is blowing in Turkana and must buy that power too, even though it does not need it and cannot ask the fossil-fuelled IPP to stop generating so abruptly. The result is higher power costs, as Kenya Power now incurs costs to both the fossil fuel IPP and the wind power IPP. The wind might also blow when our government-owned hydro dams are full, and Kenya Power still has to buy that power. The government later realized this and sought to change the contract; when LTWP declined, the company began facing hurdle after hurdle in trying to set up its wind farm. We know the government is a master of hurdle set-ups if they don't like you.

At the end of the day, the government's plan to produce 5000MW may be well-intentioned, but it has been poorly thought out because the contracts are poorly drafted, on purpose. I say this because of the endemic corruption in the country: someone is overlooking these glaring discrepancies in the contracts for personal benefit, and the consumer will end up paying dearly for these corrupt acts of commission. I am by no means saying Kenya does not need 5000MW of power. My problem is that this power will be available in the next two years, creating a glut, and that the contracts are flawed. What should have happened is a planned and gradual ramp-up of power generation over a span of 10-15 years, matching economic development to power demand, since power consumption is a result and not a cause of development. Sadly, our politicians want this done within their elective term so as to take all the glory, at whatever cost, or to meet unrealistic election promises.
