Virtualisation


Android Rooting and iOS Jailbreaking – lessons learned for IoT Security

Rachel Berry's Virtually Visual blog - Mon, 08/21/2017 - 11:38

Many security experts regard Android as the wild west of IT. An OS based on Linux, developed by Google primarily for mobile devices, it is now becoming key to many endpoints associated with IoT, automotive, televisions and more. With over 80% of smartphones running Android and most of the rest using Apple's iOS, Android is well established, and its security is a big concern.

Imagine you are a big bank and you want 20,000 employees to be able to access your secure network from their own phones (BYOD, Bring Your Own Device), or you want to offer your millions of customers your bank's branded payment application on their own phones. How do you do it?


Android and iOS have very different security models and very different ways in which they can be circumvented. Apple, with iOS, has gone down the route of only allowing verified applications from the Apple App Store to be installed. Users who want to install other applications can compromise their devices by jailbreaking their iPhone or similar. Jailbreaking can allow not only the end user but also malicious third parties to circumvent Apple's controls in iOS. iOS implements a locked bootloader to prevent modification of the OS itself or the granting of root privileges to applications.

Many people describe "rooting" on Android as equivalent to jailbreaking. It isn't. Android already allows users to add additional applications (via side-loading). Rooting an Android device can allow the OS itself to be modified. This presents a huge security risk: once the OS on which applications run has potentially been compromised, an application running on it can't really establish whether the device is secure. Asking pure software on a device "hello, compromised device – are you compromised?" is simply a risky and silly question. Software alone can theoretically never guarantee to detect that a device is secure.

There are pure software applications that claim to establish whether a device is compromised, usually via techniques such as looking for common apps that can only be installed if a device is rooted/jailbroken, for characteristics left by rooting/jailbreaking tools, or for signs of known malicious viruses/worms. These often present a falsely reassuring picture: they detect the simplest, and the majority of, compromises, so it looks as if such applications can detect a potentially insecure device. However, in the most sophisticated cases, where the OS itself is compromised, the OS can tell such applications that the device is secure even when it isn't. Being able to patch and upgrade the OS has a number of technical benefits, so some OEMs ship Android devices rooted, and there is a huge ecosystem of rooting kits to enable it. Rootkits can be very sinister, though, hiding themselves and lurking, waiting to be exploited.
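To see why such software-only checks are weak, consider a minimal conceptual sketch of one (the path list and helper name here are illustrative assumptions, not any real product's logic). The key point is that a compromised OS controls the very filesystem API the check relies on, so a "clean" answer proves nothing:

```python
import os

# Filesystem locations where an `su` binary commonly appears on rooted
# Android devices (illustrative list, not exhaustive).
SU_PATHS = [
    "/system/bin/su",
    "/system/xbin/su",
    "/sbin/su",
]

def looks_rooted(paths=SU_PATHS, extra_checks=()):
    """Naive root heuristic: report 'rooted' if any well-known su path
    exists or any caller-supplied check fires. Because a compromised OS
    mediates these calls, a False result is not proof of a clean device."""
    if any(os.path.exists(p) for p in paths):
        return True
    return any(check() for check in extra_checks)
```

This catches only the simplest rootings; a rootkit that hooks the filesystem calls can hide every one of these artefacts.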

Knowing whether your OS is compromised is a problem comparable to that faced with hypervisors in virtualisation, and one that can be solved by relying on hardware security, where hardware below the OS detects whether the OS has been compromised. Technologies such as Intel TXT on servers take a footprint of a hypervisor, lock it away in a hardware unit and compare the hypervisor against that reference at boot and ongoing; if the hypervisor is meddled with, the administrator is alerted.
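The measure-and-compare idea behind technologies like Intel TXT can be illustrated in miniature. This is a conceptual sketch only: real attestation seals the reference measurement inside a TPM and extends hardware-protected registers, not a Python variable, and the image bytes here are made up:

```python
import hashlib

def measure(image: bytes) -> str:
    """Take a 'footprint' (cryptographic hash) of a boot image."""
    return hashlib.sha256(image).hexdigest()

# At provisioning time, the known-good measurement is recorded and
# protected (in real systems, sealed by a TPM; here, just a value).
golden = measure(b"hypervisor-v1.0")

def verify_at_boot(image: bytes, reference: str) -> bool:
    """At each boot, re-measure the image and compare to the reference;
    any tampering with the image changes the hash and is flagged."""
    return measure(image) == reference
```

The crucial property is that the comparison happens below the software being checked, so the OS or hypervisor cannot simply answer "I'm fine" on its own behalf.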

Recognising the need for security in Android and other rich OSs, technologies have emerged from OEMs and chip designers that rely on hardware security. Usually these technologies include hardware support for trusted execution, a root of trust and isolation, with a stack of partners involved to ensure end applications can access the benefits of hardware security.

Typically, there is some isolation whereby both trusted and untrusted processors and memory are provided (some technologies allow the trusted and untrusted "worlds" to share the same processor). The trusted world is where tested firmware can be kept; it remains a safe haven that knows what the stack above it, including the OS, should look like. Trusted execution environments (TEEs) and roots of trust are common in cloud and mobile, and have enabled the widespread adoption of, and confidence in, mobile payment applications and the like.

Many IoT products have been built on chips designed for mobile phones, thin clients etc., and as such, running Linux/Android OSs, they have the capability to support hardware-backed security. However, many embedded devices were never designed to "be connected" with such security considerations in mind. For the IoT (Internet of Things) to succeed, the embedded and OEM ecosystems need to look to hardware-based security, following the success of the datacentre and mobile in largely solving secure connection.

Of course, it all depends on the quality of execution. Enabling hardware security is a must for a secure platform; however, if a software stack is then added in which, say, a webcam's default password is hardcoded, the device can still be compromised.

Effective Digital Content: Identifying your content top 10!

Rachel Berry's Virtually Visual blog - Mon, 08/14/2017 - 11:47
Make your top content work even harder!

This is a quick and dirty trick common in enterprise marketing and often used by proactive Product Managers themselves. Most enterprise product marketers and product managers can get access to the Google/WordPress analytics for their products.

Typically, a small percentage of the content on any website attracts the majority of reads. I've recently done some analysis on my own blog site, and in this article I'll use it as an example to explain:

1)      How to analyse your view metrics to deduce your top content

2)      What trends you may see and what they may mean

3)      A bit of background theory

There are plenty of tools out there to analyse content success, but they take time to learn and are often quite expensive; all this requires is a bit of Excel. It's something the lone blogger can also use. Keeping the tools simple also makes sure you get hands-on familiarity with your content data and the underlying methodologies those tools use.

Most website analytics should provide you with views/reads per page or blog post. Personally, I'd advise looking at unique viewers, if you can, rather than page views (a few frequent users of a page can distort the data). I'd also advise filtering out, or analysing separately, internal/intranet viewers, especially in a large company (quite often you'll find your internal marketing team is the biggest consumer of its own marketing!).

WordPress, Google Analytics and similar tools should all provide you with some metrics on readership. It's often not important whether the data has flaws; what matters is that the method of counting views is the same for all pages and has been applied consistently over the time the data was collected.

How to analyse your data

This may look a bit scary, BUT get to grips with it and you'll have some graphs and data to add to any marketing update. Once you've done it once, you can produce a reasonable report in less than an hour, and with a bit of practice, in 15 minutes.

1)      I took my blog site's views for this year, in descending order, and exported them to .csv using the button in WordPress provided for that purpose. I then opened the file in Excel and plotted the column of views. The blog title was in column A and the number of views in column B, starting at B1. Google Analytics will let you extract similar data.

2)      In cell C1 I then entered "=B1", and in cell C2, "=C1+B2". This gives you cumulative views across the site, incremented for each piece of content.

3)      I then used the fill-down feature on the cells from C2 downwards. In this case there were 108 pieces of content, so I filled down to cell C108.

4)      Then, in two spare cells below, I entered "=C108*0.5" and "=C108*0.8". These give you the number of views corresponding to 50% and 80% of total views.
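The spreadsheet steps above can be sketched in a few lines of Python, if you prefer scripting to Excel (the view counts here are illustrative numbers, not my real data):

```python
# Views per piece of content, sorted descending, as exported from
# WordPress/Google Analytics (illustrative numbers).
views = [900, 500, 300, 200, 150, 100, 80, 60, 40, 20]

# Column C from the spreadsheet: cumulative views down the ranking.
cumulative = []
total = 0
for v in views:
    total += v
    cumulative.append(total)

half_mark = cumulative[-1] * 0.5     # the "=C108*0.5" cell
eighty_mark = cumulative[-1] * 0.8   # the "=C108*0.8" cell

# Rank (1-based) of the content item at which each threshold is reached.
rank_50 = next(i + 1 for i, c in enumerate(cumulative) if c >= half_mark)
rank_80 = next(i + 1 for i, c in enumerate(cumulative) if c >= eighty_mark)
```

With these made-up numbers, the top 2 items account for 50% of views and the top 4 for 80%, which is the same kind of concentration you'll see on real sites.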

What are we looking for

·         Are your homepages/landing pages in the top 10%? These are the pieces of content from which you have the most control over user journeys around your site.

·         Which are your top 10% or even top 10 (actual number) pieces of content?

·         Which content accounts for 50% and 80% of your views?

Analysing your view data

Take the 50% and 80% view figures from step 4 above and review column C, noting the rank of the content where column C is nearest to those numbers. In my case, 50% and 80% of views were accounted for by my top 7 and top 24 pieces of content respectively.

From the data in column B I plotted the views for each piece of content (blog or webpage). I also changed the colour of the 7th and 24th pieces of content on the graph to red, to highlight these key numbers.

 

This pattern is pretty typical of many websites and blogs. A small percentage of content, often less than 10%, will account for 50% or more of your views, and 80% of your views will typically come from around 20% of your material (this is a manifestation of Pareto analysis, which in turn links to Zipf's law… more on that later). It's amazing how well most content sites fit this pattern.

 

Make your top content work harder

So, a quick bit of Excel and maths has left me knowing which 7 articles out of 108 are attracting the most views. Since these are what people are _actually_ reading, the next steps are to check the quality of the experience and improve it. I'll cover some checklists and quick tricks for this in future articles.

It's also worth reviewing what your least successful content is and why. This is the stuff where you "may" have basically wasted your time! Common reasons include:

·         It's not a topic of interest, so the blog may not have been socialised because people didn't think it was worth sharing!

·         It’s useful and important content but very niche and specific so low numbers of views are fine and to be expected.

·         You have put very good content on a poor vehicle, e.g. an area of a website that is hard to navigate to, or that has been gated (requiring a deterring login/email address to be supplied).

·         The content is very new relative to the time period over which the data was collected. Everything may be fine; you just need to analyse newer content over shorter, more recent timeframes.

·         The content isn’t optimised for SEO or well-linked to from your other content.

In my own analysis, I was pleased to see that my home page is the second-ranking piece of content. Normally you'd hope and expect landing/home pages to be high up the list, as the friendly entry points to your user journey. The article that came top was one that had been syndicated and socialised on Reddit, so I was comfortable that I understood its unusually high readership.

Key things to remember

·         The set of content you analysed is not independent of other content your company or competitors produce. You need to understand what percentage of your inbound traffic is coming to, say, your blog site versus your support forums or knowledge base. You also need to understand whether the numbers coming to your site are good or bad versus the general market and competitors.

·         The time period over which you analyse data _really_ matters. Older, well-read material scores higher on Google; very recent material has had less time to accumulate views. My blog is more like a website than a blog in that the percentage of recent new content is fairly low.

·         Marketing tags: if you are a keen user of tagged URLs for different campaigns, you may need to do some processing on your view data, as multiple URLs may map to a single piece of content.

·         If you are looking at a large site and/or one with a long legacy history, it's not unusual to have thousands of pages with very low views. Sometimes it's better just to discard data for pages below, say, 10 views.

 

The theory

Many of the newer tools/applications are like black boxes: the average digital marketer uses them without knowledge of the algorithms. When websites were quite new, this type of hands-on analysis was more common. Website traffic statistics often obey Zipf's law, a statistical pattern that shows up in language (and is also relevant to current Natural Language Processing/Understanding (NLP/NLU) work and AI). So, a quick theory/history lesson:

·         Back when "The Sun" newspaper's website was fairly young (in 1997), some widely noted analysis was done: Jakob Nielsen analysed the Zipf fit for "The Sun" website. Nielsen is one of the godfathers of user experience, dating back to the 1980s and the dawn of the internet (he was in Bell and IBM labs at the right time!), and founder of the Nielsen Norman Group, which still provides futurology and research to enterprise-grade marketing.

·         Data Science Central has discussed website statistics a few times, including the Zipf effect and some of the caveats of traffic analysis: some sites split content to boost page ratings, and SEO tools/bots can throw in data anomalies.

Zipf's law is widely found in language, connected ecosystems and networking. It's used to explain city growth, and given the connected nature of the internet it's not too surprising that it crops up there too. Other insightful reads:

·         Why Zipf's law explains so many big data and physics phenomena.

·         An old but very interesting read from HP on various areas of the Internet where Zipf’s law pops up.

·         A nice overview from digital strategists parse.ly: Zipf’s Law of the Internet: Explaining Online Behavior (their clients include The Washington Post and many other large media houses).

·         Do Websites Have Increasing Returns? More insight from Nielsen on the implications of Zipf.

·         A nice blog from a real Digital Marketing Manager giving an overview of Zipf.

 

So, I also plotted views vs rank, both on log scales, for my blog site. The shape of the graph pleasingly fits the theory (note the linear trendline overlaid in orange).
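The reason a log-log plot is the right lens here: for an ideal Zipf distribution, views are proportional to 1/rank, so log(views) against log(rank) is a straight line with slope near -1. A short script can check the fit (synthetic, perfectly Zipfian data here, as an illustration; for a real site you'd substitute your exported view counts):

```python
import math

# Synthetic Zipf-like view counts: views(rank) ~ 1000 / rank.
ranks = range(1, 101)
views = [1000 / r for r in ranks]

xs = [math.log(r) for r in ranks]
ys = [math.log(v) for v in views]

# Ordinary least-squares slope of log(views) vs log(rank):
# this slope estimates the Zipf exponent (about -1 for ideal data).
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
```

Real traffic won't fit perfectly, but a roughly straight log-log line with a slope somewhere near -1 is the signature to look for.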

*Image(s) licensed by Ingram Image

 

Block Windows XP using selective Ciphers on Citrix NetScaler

Henny Louwers Blog - Tue, 05/06/2014 - 09:48
As you probably know, Windows XP is no longer supported by Microsoft. No (security) updates will be made available for Windows XP, making it potentially vulnerable to future exploits. As an organization you will have to decide what you are going to do about these (probably unmanaged) Windows XP workplaces. There will still be […]
Categories: Virtualisation
