Kowloon, Causeway Bay, and the worst night show ever – Hong Kong

Today we played Pokemon Go. It sounds like a funny thing to do on holiday, but it’s actually a travel guide! All the Pokestops are in interesting bits of the neighbourhood, and there’s normally a little description of what you’re seeing which is cool.

We wandered through the area by Mong Kok towards Kowloon. This is the more Chinese part of the city, compared with Central on Hong Kong Island, which is the more commercial, British part. The buildings here are functional and gritty rather than beautiful. We walked past a food market where shirtless guys were cutting meat while ladies in full makeup sold it. From here we caught the MTR to Central.

Central and Hong Kong station are where the massive IFC Mall is. It’s a typical high-end mall, with all the regular wealthy brands. When I see Chloe or Tom Ford, I don’t really see them as exclusive; you just need to be rich to wear them. They’re like Hallensteins or Glassons for rich people. We went into a high-end store called Lane Crawford. I was wearing shorts, a t-shirt, and Birkenstock sandals. The guy at the perfume counter didn’t even say hello to me. I guess shop staff get pretty good at judging people in advance. No surprise, Hong Kong people are very image conscious, so I’d imagine a couple of New Zealanders dressed for a summer holiday would look out of place.

We then went to Causeway Bay and wandered around, including a visit to a craft beer bar that didn’t have a liquor license! Still, they served Karma Cola from New Zealand, which was nice, though it did cost $10 NZD for the privilege. We continued wandering and saw a lot of pretty average-looking apartment complexes, but with some seriously expensive cars parked outside: so many Tesla Model S cars! I guess when there’s no land, you invest in cars.

It was getting hot, around 25 degrees Celsius, so we decided to retreat to the Causeway Bay shopping centre and hit up my only two clothing stores, Muji and Uniqlo, for a Fashion Renewal. I really only shop there now since they’re plain and cheap. I like them, and the quality is good.

After a wee break, we headed back outside to watch the Hong Kong night light show, ‘A Symphony of Lights’. Imagine watching 10 buildings for 90 minutes, with some lasers thrown in occasionally. Not recommended. We had been standing around and walking so much that my lower back was killing me, the first time that’s ever happened. So we retreated to the hotel, and after doing many a yoga stretch, we fell asleep immediately.

Wandering the streets of Macau – Two Minute Travel

We woke up and had breakfast in the Club Sofitel lounge. What a beautiful sight. Sitting on the 17th floor of the Sofitel Macau, we had breakfast just as the sun was rising. With it so low in the sky, the sun was a brilliant shade of red, peeking through the haze.

But enough waxing lyrical; it was time for eating. The breakfast was beautiful, really delicious. Kathryn said the croissant she had with jam was the best croissant of her life.

My reflection was that this should have been the hotel we went to for our honeymoon. It really was a very romantic breakfast. Anyways, we didn’t just eat all day; we then started our walk around Macau.

Around the Sofitel is the old part of Macau. We wandered, following the signposts towards the Ruins of St. Paul. Unfortunately we missed the last signpost and ended up in a gritty sales area. Thankfully there are massive hordes of Chinese tour groups wandering around, and by following them, we stumbled upon the Ruins of St. Paul.

These ruins are the front of a church that used to be here. They’re pretty impressive. Though I do wonder if they’d be as popular if they weren’t a ruin. I’m just saying there were a lot of impressive buildings in Macau that don’t have a million tourists standing outside, but because they’re not ruins, there’s no one there!

We wandered back down the hill, and then decided to walk towards the port. It looks close on the map, but it ended up being about an hour-and-a-half walk through the searing heat. Our feet were killing us.

Once back at the port, we caught the free shuttle to the Venetian. It was kind of funny seeing the boring, dusty, hot streets we’d walked through zoom by in minutes from an air-conditioned bus. I guess that’s why no one else was walking around, right?

The Venetian is a massive thing. A massive mall. A massive casino. Really, just everything massive. We didn’t even go into the casino! We did wander the mall, which is about the size of four massive malls glued together by a gondola ride. Kathryn ended up clothes shopping at Marks & Spencer, while I bought some Scottish sparkling water. However, not everything we did was imported culture – I did try a Macanese pork chop bun, which is literally just a pork chop in bread. $7 later, it was very nice. The spices on the pork chop made it taste almost like KFC, in a good way.

Afterwards we got our bags, and thanks to Kathryn’s careful eye, we got out of the line for the bus to the Chinese border and instead took the bus back to the port for a hydrofoil bound for Hong Kong. We got off at the Kowloon Ferry Terminal and started the long 45-minute trek towards our hotel, the Dorsett Mong Kok. My feet were dying. I could feel them dying. My poor feet. After quite a wander we made it, made a drink, and went to bed.

Sofitel Macau Review – Spoiler, the best hotel in Macau!

We landed at Hong Kong International Airport at about 3.30pm local time. A neat thing at HKIA is that if you’re catching the high-speed ferry to Macau, it departs from within the airport without you needing to go through Hong Kong immigration; it’s basically an international transfer!

So that’s what we did. We walked on down to the E2 Transfer Point, and bought a couple of tickets, $254 HKD each. For people in NZ, that works out to be about $50 NZD per person. We had about half an hour before boarding, but there’s not too much to do in the airport except look at the same old duty free shops we’ve all seen before – does anyone want to buy a Toblerone?

The Turbojet ride itself is about an hour long, and pretty uneventful. We did see a bridge being built from Hong Kong to Macau, 50km long. Thankfully Kathryn took her Sea Legs pill before the boat ride, so no complaints there.

At the Macau ferry terminal, immigration was painless. We waited all of a minute, a guy looked at us, didn’t say anything, and that was that. All the shuttles to the various casinos depart from the ferry terminal. On the right-hand side are all the flash casinos, like the Venetian. On the left-hand side, past some scaffolding, are the slightly smaller shuttles, like the one to the Sofitel Macau at Ponte 16.

After about a 10-minute shuttle ride, we arrived at the hotel. We walked to the check-in counter, where we were informed that we were to be checked in at the executive lounge on the 17th floor. It turned out we’d been upgraded to a Luxury room on the 17th floor, just below the roof, with views of China across the river, in the old part of town. The check-in experience was beautiful: staff did the paperwork while we sat down, ate canapés, and drank cocktails. We were so confused – why were they treating us so nicely? We felt a bit out of place, having just come off the plane tired and sweaty, but the hotel staff were amazing.

We went to the room, and it was the Luxury Room Club Sofitel. Basically, the room was massive – 37m2, or twice the size of our hotel room in Hong Kong! A massive bath with a spa pillow, Hermes amenities, a TV in the bathroom, super high ceilings, beautiful views over the Pearl River towards China, amazing. The bed was a massive king bed with soft soft pillows, Bose sound system, and automated curtains.

It’s been an amazing trip so far, and we’re so thankful to the Sofitel Macau. Thank you; I will literally recommend this hotel to everyone I meet.

Cost wise – we paid $160 NZD for the hotel room, and the Luxury Room costs around $320 NZD a night.

Cathay Pacific Business Class for $1800 NZD – Two Minute Travel

I enjoy puzzles. Recently the puzzle I’ve been playing around with is airline frequent flyer programmes. If you’re from New Zealand, you’re probably familiar with Air New Zealand Airpoints. There’s not really much ‘point hacking’ you can do with that programme: one Airpoints Dollar equals one New Zealand dollar.

But not all airlines have such simple programmes. My current favourite is Alaska Airlines’ Mileage Plan programme. You can buy Alaska Airlines miles, and occasionally they run bonus promotions. The best offer I’ve seen is 40% extra miles. So I ended up purchasing 60,000 Alaska Airlines miles for around $1800 NZD.

Next, have a look at redeeming those miles with Alaska Airlines’ partners using their redemption chart. The best-value redemption for 60,000 miles is a return business class flight from Auckland to Hong Kong. And that’s what we booked!

Now, this is a saving of about $4000 NZD compared to buying a business class ticket outright. But there are some caveats, of course. First, award availability is limited. Use the British Airways website to find out when there are seats available on a particular flight, then call Alaska Airlines to make the reservation. Another concern is buying points in advance. Points aren’t cash – airlines can devalue them, cancel partners, or change their programme at will. So we waited until there was a good sale, looked for seats we could book, and then, while on the phone to Alaska Airlines reservations, purchased the points online.

And as for our Cathay Pacific experience? A really nice hard product, i.e. the seat, and an OK soft product, i.e. the people. We flew up on Air NZ the day before and they weren’t worried about our carry-on bags, but Cathay Pacific wanted to weigh ours! It turned out one bag was 9kg and we had to remove some stuff from it. It also turns out that business class passengers are allowed 10kg of carry-on, so a mark down there for giving us grief over something that wasn’t actually against the rules.

The lounge access was for the Air New Zealand Koru Lounge. You really can’t complain about it – it’s very nice, the food is fine, and oh my goodness, the showers. SO GOOD. Shower pressure was intense, and there was a rain shower! You can’t beat that.

On board the flight, things were OK. The seat and its surrounds are really good, but the food was a bit average. The ho-hum bircher muesli tasted like cornflakes that had been sitting in milk going soggy for two days. Maybe that’s just what bircher muesli actually tastes like? The sausage and egg omelette tasted suspiciously like something from an airport hotel, and because it was sitting next to mushrooms, the omelette had turned grey. The entertainment system is really nice, with live streaming CNN. And there’s WiFi onboard – $20 USD for the 10 hour flight, which is a good deal.

Week 12 – Internet of Things and Ubiquitous computing

This was the final week of MSYS559, E-Business Technologies, a Masters level paper from the University of Waikato.

This week we looked at upcoming trends around the Internet of Things, Ubiquitous computing, and how this new paradigm of computing hangs together with all the previous topics.

The To-Do for this week was to look at The Internet of Things – A Primer. The key takeaway is that soon everything will be connected to the Internet. And not just things, but many parts of a thing. A car can have a connection to an entertainment partner to provide audio, to the transport authority to understand safety regulations in the area, to other nearby cars to sense their locations, to the car manufacturer to measure performance, and to the petrol company to suggest the next best place to fill up and what’s available there.
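
To make the connected-car idea a bit more concrete, here’s a rough Python sketch. The endpoints, readings, and field names are all invented for illustration; the pattern is just “read the sensors, publish the state as JSON to each interested party”.

    import json
    import urllib.request

    # Hypothetical endpoints for the parties a connected car might talk to.
    ENDPOINTS = {
        "manufacturer": "https://api.example-carmaker.com/telemetry",
        "transport_authority": "https://api.example-nta.govt/road-conditions",
        "fuel_network": "https://api.example-fuel.com/nearest-station",
    }

    def read_sensors():
        """Pretend sensor readings; a real car would pull these from the CAN bus."""
        return {
            "vin": "EXAMPLE-VIN-123",
            "speed_kmh": 92.4,
            "fuel_level_pct": 23.0,
            "location": {"lat": -37.787, "lon": 175.279},
            "engine_temp_c": 88.1,
        }

    def publish_state(state: dict) -> None:
        """POST the same JSON state to every interested party."""
        body = json.dumps(state).encode("utf-8")
        for name, url in ENDPOINTS.items():
            req = urllib.request.Request(
                url, data=body, headers={"Content-Type": "application/json"}
            )
            try:
                urllib.request.urlopen(req, timeout=5)
            except OSError as err:
                # A real device would queue and retry; here we just log.
                print(f"could not reach {name}: {err}")

    publish_state(read_sensors())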

The core components of Internet of Things according to this visualisation are:

  • Technology – nearly every physical object will have sensors, communicating their state to other entities.
  • Innovation – we’ve never had the ability to monitor sensor information for everything, all the time, in a hyper-connected world. What new things can we do with this?
  • Domains – all this information will be mashed together from different data silos or domains, interconnected with everything else, and systems will start to decide what to do without human input.
  • Application – things will share information, be monitored, respond to conditions, and behave like a vast autonomous system.

The scenario I’m most interested in is energy: smart grids and smart homes. If every device in the home knows how much energy it consumes, and for what task, devices can be optimised to use energy at the right time, at the best price, for the best purpose, all automatically, without people needing to think about it.
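
As a toy illustration of that scenario, here’s a sketch that schedules flexible appliances into the cheapest half-hour price slots. The prices and appliances are invented for the example; a real smart home would pull live prices from the grid.

    # Toy example: schedule flexible appliances into the cheapest half-hour slots.
    half_hour_prices = [0.32, 0.30, 0.18, 0.12, 0.11, 0.15, 0.25, 0.35]  # $/kWh per slot

    appliances = {
        "dishwasher": 2,       # needs 2 consecutive half-hour slots
        "hot_water_boost": 1,
    }

    def cheapest_window(prices, length):
        """Return the start index of the cheapest run of `length` consecutive slots."""
        costs = [sum(prices[i:i + length]) for i in range(len(prices) - length + 1)]
        return costs.index(min(costs))

    for name, slots_needed in appliances.items():
        start = cheapest_window(half_hour_prices, slots_needed)
        print(f"{name}: run from slot {start} to {start + slots_needed - 1}")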

But there are, of course, risks. The Internet is a dangerous place, and we haven’t really considered the consequences of everything being controllable remotely. Take an electric heater. It’s turned off. Someone accidentally drapes some clothes over or near it. Hackers turn the heater on. A fire happens and people die. Who’s liable in this case? If you have a smart kettle and hackers make it boil with nothing in it, or switch it off and on a hundred times a second, what happens to the device? Who programs the kettle to define a safe standard operating mode?

The second To-Do was to read Software Above the Level of a Single Device – The Implications by Tim O’Reilly. This talks about what’s missing from the Internet of Things: people. There’s an obsession with things, what things do, what things know, what things think about. But this doesn’t take into account that things are tools, tools that help people achieve outcomes. And so while it appears that the problems of the Internet of Things are technical problems, most of the time they’re actually people problems, mostly relating to “What is the user trying to do?”.

Users provide inputs, sometimes explicitly by interacting with a thing, and sometimes implicitly by interacting with the environment. The example provided was the Nest thermostat, which adjusts the temperature depending on what you physically set it to, or on whether you’re in the room or not.

These things make decisions based on sensors, but those sensors are just user interfaces to the thing. If Nest sees via infrared that I’m not in the room, that’s still a user interface to Nest. And sometimes these user interfaces are poorly designed because the designers only think about what the thing is trying to achieve, without taking into account the context of the world we’re in and how the thing fits into that world. The example used was the Tesla car key, which doesn’t have a key ring. It does everything a car key needs to do, but has a poor user interface to the world, since people keep losing it.
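
A minimal sketch of that idea: the occupancy sensor is just another input to the thermostat’s decision, exactly like someone turning the dial. The thresholds and setback below are made up for illustration, not Nest’s actual logic.

    from dataclasses import dataclass

    @dataclass
    class ThermostatInputs:
        manual_setpoint_c: float      # explicit input: someone turned the dial
        room_occupied: bool           # implicit input: infrared presence sensor
        minutes_since_motion: int

    def target_temperature(inputs: ThermostatInputs) -> float:
        """Both inputs are 'user interfaces'; the implicit one just lowers the setpoint."""
        if not inputs.room_occupied and inputs.minutes_since_motion > 30:
            return inputs.manual_setpoint_c - 4.0   # setback while nobody is home
        return inputs.manual_setpoint_c

    print(target_temperature(ThermostatInputs(21.0, room_occupied=False, minutes_since_motion=45)))
    # -> 17.0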

However, the overriding theme of the reading was, don’t just use Internet of Things to solve today’s problems. Think about tomorrow’s problems and solve those. Solve the hard things.

My reflection is, by the time you’ve come up with something to solve the future problem, the future is already here, and hopefully your timing is great.

Week 11 – Mobile Commerce and Location-based Services

This week we looked at m-commerce, the technologies behind it, and location-based services.

The review questions for this week are:

Discuss the attributes, benefits, and fundamental drivers of m-commerce.

The attributes of m-commerce are:

  • Ubiquity – m-commerce is everywhere, you don’t have to go to a special place to find it, it is where you are.
  • Convenience – because it’s everywhere, it’s very easy to take advantage of, which makes it the first choice of commerce for a lot of scenarios, like ordering a taxi.
  • Interactivity – purchasing involves both you and the system, so each party gets to know the other better and the service improves.
  • Personalisation – as you interact with a service more and more, it understands your tastes and configures itself to emphasise the things you want, like how amazon.com makes suggestions based on what you’ve already looked at.
  • Localisation – because m-commerce is everywhere, it can take advantage of your location to tailor services that are relevant to you. If the phone knows where you are, then when you order a taxi you no longer have to tell it where to pick you up.

The benefits of m-commerce are the concrete realisation of the attributes listed above, which are all positive reasons for people to embrace m-commerce. There’s a device in your pocket which can let you transact with the world at any time, compare prices, give you a wealth of information, and help you make decisions fast. Why wouldn’t people take advantage of it?

The fundamental driver of m-commerce is that the world is becoming more mobile, and mobile is the primary computing platform of choice for more and more people. This year tablet sales will finally surpass PC sales (http://www.extremetech.com/computing/185937-in-2015-tablet-sales-will-finally-surpass-pcs-fulfilling-steve-jobs-post-pc-prophecy). So if people are moving towards mobile platforms, it’s no surprise that m-commerce exists to satisfy that demand.

Discuss m-commerce applications in banking and financial services.

Kiwibank Home Hunter (http://www.sushmobile.com/nz/home-hunter-5/) is an engaging app used to find houses for sale. Potential customers can locate houses for sale in the app, and then, at the property, do things like track the sun across the sky to understand how much sun the house is likely to receive. That is only possible in real time by taking advantage of the phone’s location. Once a potential customer decides they like the house, they can apply for mortgage pre-approval in the app to understand their borrowing position.

For me, it was amazing to see the engagement the bank could create through one app. People no longer need to go to one website to find a house, another to arrange a mortgage, and another to learn about the area.

Describe consumer and personal applications of m-commerce including entertainment.

The big m-commerce application these days is mobile gaming. A powerhouse in mobile gaming is King, with their popular game Candy Crush Saga (https://en.wikipedia.org/wiki/Candy_Crush_Saga); in 2014, $1.33 billion USD was spent on in-app purchases in the game.

Understand the technologies and potential applications of location-based m-commerce.

The technologies involved in location-based m-commerce rely on knowing where the device is. The primary technology is GPS, these days augmented with GLONASS (https://en.wikipedia.org/wiki/GLONASS). The principle is based on trilateration between different satellites: the distance to each satellite is calculated from the time its signal takes to reach the mobile device. Because of the reliance on satellites, GPS+GLONASS doesn’t do very well indoors or in covered areas. There are coarser methods of locating a device, such as the mobile base station or WiFi hotspot it is connected to (which have known locations), and finer-grained methods, such as proximity to indoor beacons.
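
The distance part of that calculation is simple enough to show: signal travel time multiplied by the speed of light. A real receiver then solves for position and clock error using at least four satellites; this sketch only covers the per-satellite distance.

    SPEED_OF_LIGHT_M_S = 299_792_458

    def satellite_distance_m(signal_travel_time_s: float) -> float:
        """Distance to one satellite = travel time x speed of light."""
        return signal_travel_time_s * SPEED_OF_LIGHT_M_S

    # A GPS satellite orbits at roughly 20,200 km, so its signal takes about 0.067 s to arrive.
    print(f"{satellite_distance_m(0.0674):,.0f} m")   # roughly 20,206,000 m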

Anyways, the point of knowing a location is to customise the service provided to the customer. A good example would be highlighting food trucks nearby. Food trucks are mobile, so they aren’t always in the same location; people are mobile, so they aren’t always in the same location either. If food trucks and people both share their locations, then each party can find the best way to get closer together: food trucks can target locations with a lot of people, and people can find where their favourite food truck is parked. A win-win for both parties.
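
Here’s a sketch of the matching step, using the haversine formula to find the food truck closest to a customer. The truck names and coordinates are invented for the example.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    # Invented truck locations around central Auckland.
    trucks = {
        "Taco Truck": (-36.8485, 174.7633),
        "Burger Bus": (-36.8570, 174.7590),
        "Dumpling Van": (-36.8440, 174.7700),
    }

    customer = (-36.8509, 174.7645)
    nearest = min(trucks, key=lambda name: haversine_km(*customer, *trucks[name]))
    print("Nearest truck:", nearest)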

There were also some To-Dos for this week which were:

Comment on the recent TechCrunch article “The Future Of the Web is All About Context“.

The crux of the article is that today’s services are personalised, but only within the silos of information they’re aware of. To provide better personalisation, services need to aggregate information across different data sources. On top of that, semantic processing is required to understand the context of why someone wants the information. This is all to get towards the holy grail of answering questions like “what movies are on near me that are similar to other movies I like?”. To do this, a system would need to get a list of movies I like (say from Netflix), find movie theaters near me (from Google Maps), look at what’s playing at each (say from the theater company), and then mash all the data together into a reasonable answer.
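
To show what that mash-up might look like in code, here’s a sketch with stub functions standing in for the streaming, maps, and cinema services. The function names and data are hypothetical; the real APIs would obviously look different.

    # Hypothetical stubs standing in for real services (streaming, maps, a cinema chain).
    def liked_genres(user_id):
        return {"sci-fi", "thriller"}                      # from the streaming service

    def theatres_near(location, radius_km=5):
        return ["Cinema A", "Cinema B"]                    # from the maps service

    def now_playing(theatre):
        return {"Cinema A": [("Dune", "sci-fi"), ("Rom-Com X", "romance")],
                "Cinema B": [("Heist Night", "thriller")]}[theatre]

    def movies_like_what_i_like(user_id, location):
        """Aggregate the three silos into one contextual answer."""
        genres = liked_genres(user_id)
        return [(title, theatre)
                for theatre in theatres_near(location)
                for title, genre in now_playing(theatre)
                if genre in genres]

    print(movies_like_what_i_like("me", (-36.85, 174.76)))
    # -> [('Dune', 'Cinema A'), ('Heist Night', 'Cinema B')]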

I don’t think the article addresses the ‘creepy’ factor, which is, as systems get a better aggregated view of people, are people OK with that? Maybe I don’t mind that Netflix knows the movies I like, but I don’t want movie theater companies to know this.

Identify an additional example of disruptive innovation in connection with mobile commerce.

The example of disruptive innovation relating to mobile commerce I selected was eBay. eBay has created a global auction and ‘garage sale’ that is also accessible from mobiles. The idea that at any time, anyone worldwide can list a product to anyone else worldwide, and decide whether to let the market determine the value through an auction or just sell at a known price with a known margin, is game changing.

I do note that they’re not as ubiquitous throughout the world as they would like. In Australia, for instance, they dominate the market, with 60% of online shoppers using the site in 2013. But in New Zealand, TradeMe became the more popular auction website, mainly due to first-mover advantage as well as localisation for New Zealand.

Week 10 – E-Business Architectures

This week we discussed E-Business Architectures, a topic of personal interest to me as an Enterprise Architect!

The first principle we covered was the divide-and-conquer approach to IT: really, anything can be broken up from a conceptual idea into a group of logical units that are realised at a physical level. Of course, this also describes the high-level principles of Enterprise Architecture!

For me, I like the 12+1 view of Enterprise Architecture which is as follows:

  • Contextual – the layer that defines business strategies, purpose, outcomes and so on; why an organisation exists.
  • Conceptual – the layer senior managers operate at; it describes the highest-level components within an organisation, like business units, organisational data models, and generic technologies like CRMs and ERPs.
  • Logical – a further breakdown into supporting functions. From a business perspective this means business processes; from a technology perspective, a particular component such as an application server.
  • Physical – the realisation of the logical layer; think of it as the concrete version. Processes are realised as procedures and work files. The logical application server gets realised as installed software on one or more literal servers, deployed to a literal data centre.

The conceptual, logical, and physical layers are then sliced into business, information, application, and technology towers, creating a 12-box matrix, with the +1 being the contextual layer.

Anyways, we then moved on to Service Oriented Architecture. Its core characteristic is a group of loosely coupled services which can be combined into a complex process. The advantage of SOA is that any particular service can be replaced by another without materially changing the rest of the business process, and services can be delivered by both internal and external entities. A good example of a service is an address lookup service: it could initially be built internally on an internal database, or it could be a web service provided by NZ Post.

So a business process could be: a customer joins an energy company. For this to happen, the customer needs to be quoted a price and a product, and for that, their address needs to be provided. Once the customer provides an address, it can be passed to an address verification service, which I view as a technical service. Once that returns, we can call the quoting service, which in turn calls the product technical service, which itself calls the pricing service. This collection of loosely coupled services is aggregated into a business service, i.e. the customer join service.
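
Here’s a sketch of that composition in code. Each function stands in for an independent, loosely coupled service, and the business-level customer join service just orchestrates them; all the names, products, and prices are invented for illustration.

    # Each function stands in for a service behind its own interface; any one of them
    # could be swapped for an external provider (e.g. NZ Post for addresses)
    # without the orchestration below changing.

    def verify_address(raw_address: str) -> dict:
        return {"verified": True, "address": raw_address.title()}       # technical service

    def lookup_products(address: dict) -> list:
        return ["standard-plan", "low-user-plan"]                       # product service

    def price_product(product: str, address: dict) -> float:
        return {"standard-plan": 0.28, "low-user-plan": 0.33}[product]  # pricing service

    def customer_join_quote(raw_address: str) -> dict:
        """Business service: orchestrates the technical services into one quote."""
        address = verify_address(raw_address)
        return {product: price_product(product, address)
                for product in lookup_products(address)}

    print(customer_join_quote("12 example street, hamilton"))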

We talked about some of the technical standards around SOA, like WSDL to describe web services, SOAP for the format of the messages, and UDDI to discover services, but from my perspective most of that stack was never really adopted, because of the complexity of those integration methods. These days, REST+JSON seems to be the easiest way to integrate services, especially for modern web applications.
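
And the REST+JSON version of one of those calls is about as simple as integration gets. This is a sketch against a hypothetical address-lookup endpoint, using only Python’s standard library.

    import json
    import urllib.parse
    import urllib.request

    def lookup_address(query: str) -> dict:
        """Call a hypothetical REST+JSON address service and decode the response."""
        url = "https://api.example-addresses.co.nz/v1/search?" + urllib.parse.urlencode({"q": query})
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.loads(resp.read().decode("utf-8"))

    # lookup_address("12 Example Street, Hamilton")
    # -> e.g. {"matches": [{"address": "12 Example Street, Hamilton 3204"}]}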

We then talked about Enterprise Architecture frameworks, a topic of interest to me! We looked at the Zachman Framework, which is an ontology, or a way of categorising things like servers, applications, and business functions. Zachman doesn’t describe how to use the ontology; it just provides one. TOGAF (in which I’m certified!) describes a method of doing Enterprise Architecture, but doesn’t really describe the outputs. If you add Zachman and TOGAF together, you end up with something practical, which in my opinion is the Integrated Architecture Framework.

Finally, we looked at ITIL, specifically its impact on SOA management and governance. ITIL talks about the lifecycle of services: service strategy, design, transition, operation, and continual service improvement. In practice, though, I’ve only really seen Service Operation in use in organisations, typically Service Desk, Incident Management, Problem Management, and sometimes Capacity Management.

However, full ITIL could be used to think about the business value of services: who the stakeholders of a service are, how they define value, and how the service (made up of more technical services) realises the stakeholders’ business outcomes. In practice, though, I haven’t seen this in my 13 years in IT in New Zealand.

Week 9 – Data Warehouses

This week we covered data warehouses, with a bit of a focus on the relationship with big data. A few questions posed were:

  1. What changes occur in the presence of big, fast, possibly unstructured data?
  2. Is the Data Warehouse architecture still the same?
  3. If not, what needs to be changed or adapted?

In my view, big data is just another data dimension that can be processed with technologies like Hadoop and then brought into the data warehouse like any other dataset. Using a mechanism like MapReduce to gain insight from masses of data allows that insight to be overlaid with other data from transactional and external systems, providing better information for business decisions.

But is the data warehouse architecture still the same? Yes and no. I think the original driver behind having a data warehouse was to be able to run queries against your data without affecting your transactional system’s performance. But these days you could run your transactional system on an in-memory database like SAP HANA, which is very fast. So do you still need a data warehouse? http://www.element61.be/e/resourc-detail.asp?ResourceId=767 argues that you do, because:

  1. Data warehouses provide a single version of the truth over aggregates of data coming from multiple data sources, not just transactional systems;
  2. Data warehouses can run data quality processes that wouldn’t be running in the transactional system;
  3. Data warehouses can provide a historical view of information, which may no longer be stored in a transactional system.

Therefore, it’s likely that the data warehouse architecture will remain, augmented by in-memory technologies, with big data systems like Hadoop (or HDFS) used as just another data source feeding the warehouse. This was reiterated in one of the readings for the week, “Integrating Hadoop into Business Intelligence and Data Warehousing” by Philip Russom, which notes that “Users experienced with HDFS consider it a complement to their DW”.
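
As a toy illustration of that layered pattern, here’s a sketch where plain Python stands in for the big data aggregation job and sqlite3 stands in for the warehouse: crunch the raw reads outside the warehouse, then load only the summarised result alongside the other sources.

    import sqlite3
    from collections import defaultdict

    # Raw half-hourly smart meter reads: the kind of volume you'd crunch in Hadoop/HDFS.
    raw_reads = [("ICP-001", "2015-07-01", 0.42), ("ICP-001", "2015-07-01", 0.38),
                 ("ICP-002", "2015-07-01", 1.10), ("ICP-001", "2015-07-02", 0.51)]

    # Step 1: aggregate outside the warehouse (stand-in for the Hadoop job).
    daily_totals = defaultdict(float)
    for icp, day, kwh in raw_reads:
        daily_totals[(icp, day)] += kwh

    # Step 2: load only the summarised result into the warehouse (sqlite3 as a stand-in).
    dw = sqlite3.connect(":memory:")
    dw.execute("CREATE TABLE daily_consumption (icp TEXT, day TEXT, kwh REAL)")
    dw.executemany("INSERT INTO daily_consumption VALUES (?, ?, ?)",
                   [(icp, day, kwh) for (icp, day), kwh in daily_totals.items()])

    for row in dw.execute("SELECT * FROM daily_consumption ORDER BY icp, day"):
        print(row)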

I think an infrastructure trend for data warehouses is building them in the cloud. Cloud infrastructure is very cheap, with products like Amazon Redshift providing cloud data warehouses that can store petabytes of information without you having to purchase expensive hardware.

Another reading was to look at Tableau’s perspective on the Top 10 Trends in Business Intelligence for 2014. As an Enterprise Architect I read these sorts of sales pitches/white papers every day, and I find them to be a bit generic. The list is as follows:

  1. The end of data scientists.
  2. Cloud business intelligence goes mainstream.
  3. Big data finally goes to the sky.
  4. Agile business intelligence extends its lead.
  5. Predictive analytics, once the realm of advanced and specialized systems, will move into the mainstream.
  6. Embedded business intelligence begins to emerge in an attempt to put analytics in the path of everyday business activities.
  7. Storytelling becomes a priority.
  8. Mobile business intelligence becomes the primary experience for leading-edge organizations.
  9. Organizations begin to analyze social data in earnest.
  10. NoSQL is the new Hadoop.

This list really shows the relationship of data warehouses and BI to the broader context of IT, such as cloud computing, agile, and mobility. So while all the points make sense, there aren’t too many pearls of wisdom. In fact, pointing out that storytelling is becoming a priority seems pretty self-evident to me: if the point of BI is to “turn data into insight to make business decisions”, then if decision makers don’t understand the insight put in front of them, they’ll fail to use that insight to make their decisions, eroding the business value of BI.

Week 8 – Big Data and Hadoop

This week we covered Big Data and Hadoop, a topic of dear interest to me, as I try and understand what to do with all the electricity smart meter data reads we receive as a company. We used to receive one meter read every two months. Now we receive 48 meter reads a day, or 2880 every two months. That’s quite a volume increase, and increasingly we’ll need to rely on big data techniques to process this data.

Which brings me to my first task for this week, which was to look at other potential or existing use cases for big data. As you can see, the increase in electricity meter reads is quite significant. But it’s still not enough. To start analysing how people actually consume electricity, we’ll need to move towards minute-by-minute readings for each device in the household. That’s 1,440 readings per device per day, so with, say, five monitored devices, around 7,200 meter reads a day, or 432,000 every two months. As you can imagine, that’s quite a volume increase from one meter read every two months!

The second task for the week was to check out http://www.kdnuggets.com/2015/07/big-data-big-profits-google-nest-lesson.html, a Google Nest case study. Google’s Nest is a thermostat for heating and air conditioning systems in the USA. Nest learns people’s patterns of behaviour in terms of the cooling and heating they want, and delivers it more efficiently than existing ‘dumb’ thermostats. Nest is more efficient because it can figure out that no one’s home and reduce heating, saving power and money. Of course, to do that it needs to remember and process a lot of data points, which makes it an example of big data similar to the smart meter scenario I described earlier.

The third task for the week was to read an IBM White Paper on the Top Five Ways to get started with big data (http://public.dhe.ibm.com/common/ssi/ecm/im/en/imw14710usen/IMW14710USEN.PDF), which are:

  1. Big Data exploration, which is exploring information from sensors, and extracting trends. The company I work for currently does this, by extracting information from Power Station sensors, and doing trend analysis, using software called OSIsoft PI Historian (http://www.automatedresults.com/PI/pi-historian.aspx).
  2. Getting a 360-degree view of the customer, which is very important to the company I work for. The more we know about a customer, the more finely we can tailor our products and pricing to them, which in turn is designed to improve service and reduce churn. Of course, the counterpoint is that some people find it creepy when large organisations collect a lot of information about customers, so there is a responsibility to make sure we do that collection with good intentions, i.e. for the purpose of delivering better products and services. Increasingly, big data also needs to be combined with in-memory databases such as SAP HANA (http://hana.sap.com/abouthana.html) so we can process the data in a timely manner.
  3. Security and intelligence extension, another valuable use case for the company I work for: since the number of cyber attacks against us continues to grow, being able to sort through the logs of hundreds of servers and thousands of desktops lets us spot trends, such as malicious attacks running over multiple months. Without big data techniques we wouldn’t be able to process that volume of logs. Tools like Splunk (http://www.splunk.com/) allow us to analyse this.
  4. Operations analysis, which is the optimisation of our business using sensor data. I’d argue this is a pretty similar use case for us to big data exploration, though I understand one is about exploring for new trends and the other is about optimising existing patterns in the data.
  5. Data warehouse optimisation, which is particularly important considering the massive increase in data processing (see my original point about smart meter data).

The big implication I already touched on is the creepiness factor of large organisations knowing more and more about you. My view is that the mass personalisation of products and pricing delivers better service, though I also understand why some people would want to opt out of this data utopia. I do think, though, that it will become more and more difficult, if not impossible, to opt out. It’s a bit like not using Facebook: sure, you don’t have to, but eventually you’ll never get invited to events because they’re all hosted on Facebook where you’ll never see them. So I don’t think all the implications of big data are positive, but then again, all technology has positive and negative consequences.

Finally, we were tasked to think about whether ‘big data’ is the right phrase. Personally, I think it’s just data, rather than big data. There is an explosion of data everywhere, growing exponentially, so eventually there won’t be any processing other than big data processing.

As a side note, we also went through how MapReduce works. My advice is to check out:

which is an excellent video describing how MapReduce splits tasks across nodes, then combines the outputs to create a result.
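
If you want a rough feel for the idea without Hadoop, here’s a toy single-process MapReduce in Python over smart meter reads: map emits (meter, kWh) pairs, the shuffle groups them by meter, and reduce sums each group. A real Hadoop job does the same thing, just spread across many nodes.

    from itertools import groupby
    from operator import itemgetter

    records = [("ICP-001", 0.42), ("ICP-002", 1.10), ("ICP-001", 0.38), ("ICP-002", 0.95)]

    # Map: emit (key, value) pairs. Here each record is already in that shape.
    mapped = [(meter, kwh) for meter, kwh in records]

    # Shuffle: group all values for the same key together (Hadoop does this across nodes).
    mapped.sort(key=itemgetter(0))
    grouped = groupby(mapped, key=itemgetter(0))

    # Reduce: combine each group into a single result.
    totals = {meter: round(sum(kwh for _, kwh in pairs), 2) for meter, pairs in grouped}
    print(totals)   # {'ICP-001': 0.8, 'ICP-002': 2.05}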

Week 7 – Cloud Computing

This week’s focus was on Cloud Computing, a topic of dear interest to me. The first thing we were tasked to do was discuss which business models appear appropriate for the cloud. In order to do that, we need to look at the NIST (http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf) Definition of Cloud Computing, which notes the following essential characteristics:

  • On-demand self-service
  • Broad network access
  • Resource pooling
  • Rapid elasticity
  • Measured service

In effect, the cloud is a great technology platform for businesses that have started from zero and would like to scale up without incurring the costs of purchasing hardware or the significant capital investment of a data centre. Start-ups are a great candidate for cloud platforms. Another good fit is traditional businesses that need a platform for proofs-of-concept (POCs): large companies can try new technologies without any consequences for existing infrastructure, and the environments can be shut down just as easily. Another business type suited to the cloud is organisations that need to do batch processing of information in a timely manner. Two examples are MetService, who use AWS (http://www.stuff.co.nz/technology/digital-living/8741213/Amazon-ahead-in-the-cloud) to augment their on-shore weather forecasting simulations, and Qantas (http://www.itnews.com.au/news/qantascom-begins-transition-to-aws-402996), who use AWS for flight and weather forecasting.

Next, we looked at two assigned readings, the first being How CloudFlare promises SSL security – without the key (http://arstechnica.com/information-technology/2014/09/in-depth-how-cloudflares-new-web-service-promises-security-without-the-key/). This article discusses how organisations want to use cloud computing resources, which allow large organisations like banks to absorb denial-of-service attacks, without handing over the keys to the kingdom, so to speak, or in this particular case, the SSL private key used to decrypt communications. Therefore, CloudFlare has created a method that allows the private keys to remain stored on the customer’s servers, rather than on CloudFlare’s servers. This lets organisations take advantage of the cloud while still controlling their own security.

The second reading was How can we protect our information in the era of cloud computing (http://www.cam.ac.uk/research/news/how-can-we-protect-our-information-in-the-era-of-cloud-computing). The article describes how information can be protected in the cloud by creating multiple copies in a decentralised, peer-to-peer manner. It goes on to quote Professor Jon Crowcroft saying “We haven’t seen massive take-up of decentralised networks yet, but perhaps that’s just premature”. I’d argue that we do see massive peer-to-peer networks; they’re just being used to distribute movies and other pirated material. As legal authorities moved to shut down torrent trackers, these networks evolved towards magnet links (https://en.wikipedia.org/wiki/Magnet_URI_scheme), which no longer require a torrent tracker but instead identify content based on a hash value.
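
As a very simplified sketch of that idea: identify content by hashing the content itself, so any peer holding matching data can serve it. Real magnet links hash the torrent’s bencoded info dictionary rather than the raw file, so this only shows the general shape.

    import hashlib

    def content_id(data: bytes) -> str:
        """Identify content by what it is (its hash), not where it lives."""
        return hashlib.sha1(data).hexdigest()

    blob = b"some file contents shared on a peer-to-peer network"
    digest = content_id(blob)

    # Simplified magnet-style URI: any peer holding data with this hash can serve it.
    print(f"magnet:?xt=urn:sha1:{digest}")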

The final task was to look at the pros and cons of New Zealand Government’s Cloud First strategy (https://www.ict.govt.nz/guidance-and-resources/information-management/requirements-for-cloud-computing). The pros below are listed from the previous link:

  • Cloud computing solutions are scalable: agencies can purchase as much or as little resource as they need at any particular time. They pay for what they use.
  • Agencies do not have to make large capital outlays on computing hardware, or pay for the upkeep of that hardware.
  • Cloud computing provides economies of scale through all-of-government volume discounts. This is particularly beneficial for smaller ICT users.
  • Agencies can easily access the latest versions of common software, which deliver improved and robust functionality and eliminate significant costs associated with version upgrades.
  • If agencies are able to access the same programmes, and up-to-date versions of those programmes, this will improve resiliency and reduce productivity losses caused when applications are incompatible across agencies.

The main con highlighted in the article is that using the cloud isn’t a free pass to outsource risk; ultimately it’s the agency’s responsibility whether or not to use the cloud. This includes, for example, ensuring that data classified above RESTRICTED isn’t put in a public cloud.