Industry news

Upgrading Ubuntu 16.04 to 18.04 & PHP 7.0 to 7.2 for WordPress

Helge Klein - Wed, 12/12/2018 - 00:28

This post describes how I upgraded our webserver running WordPress on Apache from Ubuntu 16.04.5 LTS to 18.04.1 LTS. Please see this article for more information on the server’s installation and configuration.

Backup

Before you begin, create a checkpoint (snapshot) in Hyper-V Manager. If anything goes wrong, a checkpoint makes it trivially easy to get back to the last working state.
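If you prefer to script the checkpoint on the Hyper-V host, PowerShell can do the same thing (a minimal sketch; the VM name is an assumption):

# Run on the Hyper-V host; replace 'webserver' with the actual VM name
Checkpoint-VM -Name 'webserver' -SnapshotName 'Before Ubuntu 18.04 upgrade'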

Installing all Available Updates

sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get autoremove

Reboot and check Apache’s error log:

sudo shutdown -r now
tail /var/log/apache2/error.log

Upgrading to Ubuntu 18.04.1

sudo do-release-upgrade

During the upgrade process:

  • When asked whether to install the updated version of /etc/sysctl.conf, select “yes”
  • When asked whether to install the updated version of /etc/apache2/apache2.conf, select “no”
  • When asked whether to install the updated version of /etc/logrotate.d/apache2, select “yes”
  • When asked whether to install the updated version of /etc/ssh/sshd_config, select “keep the local version”
  • When asked whether to install the updated version of security.conf, select “no”
18.04 Upgrade Package Changes

Packages no Longer Supported
  • ntp
  • tcpd

You may want to uninstall these packages once the upgrade is finished by running the commands:

sudo apt-get remove ntp
sudo apt-get remove tcpd
sudo apt-get autoremove

Removed Packages
  • curl
  • systemd-shim
  • libapache2-modsecurity
Upgrade PHP

Package Name Changes

In the upgrade from Ubuntu 16.04 to 18.04 the PHP version is upgraded from 7.0 to 7.2, which is a good thing. What is not so great is that the names of all the PHP packages change from php7.0-* to php7.2-*. Due to that name change, Apache’s PHP configuration is broken after the upgrade and must be fixed manually.

Additionally, the upgrade routine is not clever enough to upgrade any manually installed PHP packages. The 7.0 versions of the following packages are uninstalled instead of being replaced with their 7.2 versions (a quick way to check what is actually left installed is shown after the list):

  • php7.0-curl
  • php7.0-gd
  • php7.0-json
  • php7.0-mbstring
  • php7.0-mcrypt
  • php7.0-mysql
  • php7.0-opcache
  • php7.0-tidy
  • php7.0-xml
  • php7.0-xmlrpc
  • php7.0-cli
  • php7.0-common
  • php7.0-readline
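A quick way to check what is actually left installed after the upgrade is to query dpkg (a shell sketch; any package still at 7.0 needs attention):

dpkg -l 'php*' | grep '^ii'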
Migrating from PHP 7.0 to PHP 7.2

Apache Configuration

Enable the PHP 7.2 module:

sudo a2enmod php7.2
sudo service apache2 restart

Installing Missing PHP 7.2 Modules

sudo apt-get install php7.2-mysql php7.2-curl php7.2-gd php7.2-json php7.2-mbstring php7.2-opcache php7.2-tidy php7.2-xml php7.2-xmlrpc
sudo apt-get autoremove
sudo service apache2 restart

Note: mcrypt is not available any more with PHP 7.2.
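To confirm that the required modules actually load after installing them, you can list the modules PHP knows about (this queries the CLI configuration, which normally matches the packages installed above):

php -m | grep -Ei 'curl|gd|mbstring|mysql|opcache'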

PHP 7.2 Hardening and Optimization

Edit /etc/php/7.2/apache2/php.ini:

Add the following to disable_functions: exec,system,shell_exec,passthru
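In /etc/php/7.2/apache2/php.ini the resulting line might look roughly like this (a sketch; keep any functions that are already listed there):

disable_functions = exec,system,shell_exec,passthru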

Configure PHP’s opcache by setting:

opcache.enable=1
opcache.memory_consumption=256
opcache.interned_strings_buffer=10
opcache.max_accelerated_files=10000

Restart Apache:

sudo service apache2 restart

Fixing PHP Errors

PHP Warning “Illegal string offset”

Cause: a string variable is used like an array, e.g.:

$var[index] = "value";

Fix it by adding an array check:

// DATE PHP 7.2 compat: added check if $var actually is an array
if (is_array ($var))
    $var[index] = "value";

Removing Obsolete PHP Directories

Clean up remainders from earlier migrations:

sudo rm -r /etc/php5
sudo rm -r /etc/php/7.0

Adjusting the Logrotate Configuration

Edit /etc/logrotate.d/apache2 so that it says:

rotate 30
dateext
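For context, the complete /etc/logrotate.d/apache2 might then look roughly like the following sketch (the other directives are the Ubuntu defaults and may differ on your system; the postrotate scripts are omitted here):

/var/log/apache2/*.log {
        daily
        missingok
        rotate 30
        dateext
        compress
        delaycompress
        notifempty
        create 640 root adm
        sharedscripts
}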

Re-enabling the mod_pagespeed Repository

This was disabled during the upgrade.

sudo rm /etc/apt/sources.list.d/mod-pagespeed.list
sudo mv /etc/apt/sources.list.d/mod-pagespeed.list.distUpgrade /etc/apt/sources.list.d/mod-pagespeed.list

Checking for errors

Check Apache’s error log:

tail /var/log/apache2/error.log

The post Upgrading Ubuntu 16.04 to 18.04 & PHP 7.0 to 7.2 for WordPress appeared first on Helge Klein.

Microsoft Ends 2018 With Failed Windows Update

Theresa Miller - Tue, 12/11/2018 - 06:30

Many home users of Microsoft products almost take for granted that Windows Update will operate correctly. For those of you keeping score, Microsoft pulled the October 2018 Windows Update because it was unexpectedly deleting user files. To finish the year with a bang, Microsoft pulled yet another failed Windows Update in December of 2018 to end […]

The post Microsoft Ends 2018 With Failed Windows Update appeared first on 24x7ITConnection.

What is Apache?

The Iconbar - Fri, 12/07/2018 - 07:35
With RISC OS switching to the Apache license, here is your brief intro to the world of Apache....

Apache the software program
Apache is a key building block of the Internet. It runs on many of the servers which make up the Internet and allows them to provide the websites you use every day. Its many features include the ability to host multiple websites on a server, control access and provide security, execute scripts and commands when you access pages, and log website activity, among a whole host of others. You use Apache every day without realising it.

Apache the license
All software has a license which defines what rights you have and what use you can make of a piece of software. For example, most commercial software bans you from trying to dissect it and give it away to your friends.

The Apache License is one of several Open Source licenses. These generally come with free software (as in, you do not have to pay for it) which includes the source code. The big difference between Open Source licenses is that some are viral (with the GPL you have to release any software which uses it under the same license, so it 'infects' the software) and some are non-viral (you can use them with other software, including commercial software, as long as you respect the rules of the original software).

It is possible to release software under more than one license. A nice example is the PDF library iText, which you can use for free under the AGPL license (requiring you to release your code for free as well, with the source code), or buy as a commercial version (identical, except that it comes with a commercial license removing this requirement so you can use it in commercial software).

If your aim is to encourage maximum uptake and usage, you would choose a non-viral license such as the Apache license, which is what RISC OS now uses.

If you own the software, you can choose to change the license (as RISC OS Developments has done, having acquired RISC OS), but you cannot modify the license on software belonging to someone else.

Apache the foundation

There is also an organisation called the Apache Software Foundation which provides a home for a large number of software programs developed under the Apache license. Most of these are technical; you might have heard of them if you are a software developer (e.g. Ant, Groovy, Hadoop, Maven, Perl), or because they run on servers providing Internet services (e.g. SpamAssassin, Tomcat).

Apache is an organisation of individuals (no Company/Corporate membership option) and anyone can join. It also organises conferences and promotes software development.

Anyone can use the Apache license for their software. This is perfectly acceptable, and many other software projects have been doing so for many years.

If you want your software to be an 'official' Apache project, you also need to follow the Apache rules on how software is developed. These lay down a clear methodology and governance.

Many of the software projects which are now Apache projects started life outside Apache and joined by adopting the Apache rules. For the last two years, the Java IDE NetBeans has been transitioning to an Apache project (I have had a minor involvement in that, which has given me a very interesting view of the Apache Foundation).

More details on Open Source licenses at GNU website
Apache website

Categories: RISC OS

PowerShell Script: Test Chrome, Firefox & IE Browser Performance

Helge Klein - Tue, 12/04/2018 - 17:58

There is more than one way to test the performance of web browsers like Chrome, Firefox, or IE, but regardless of how you do it, you need a consistent workload that makes the browsers comparable. Unless you are testing with synthetic benchmarks (which come with a plethora of problems of their own) you need a way to automate browsers opening tabs and loading URLs. This article presents a simple solution to do just that.

Purpose of This Browser Test Script

I have written about various aspects of browser performance and privacy before. For those earlier articles, I manually ran the browsers through a series of tests. This quickly proved to be tedious and error-prone. Obviously, automation is the name of the game.

For this year’s session Web App Performance in a Virtual World, which I presented at Citrix Technology Exchange and at community meetups, I went ahead and finally automated a large part of the test process, and it paid off immediately. I was able to test more configurations in less time with increased accuracy.

Testing is not enough, of course. You need to measure, too. For that, I have been using our uberAgent user experience and application performance monitoring product. uberAgent measures browser page load duration for all major browsers (which was important here) in addition to providing detailed application usage and performance insights for all installed and running applications – Win32, UWP, Java, App-V, etc.

What the Browser Test Script Does

It is really quite simple. I needed a script that would do the following – mind you, for any number of installed browsers and for a list of URLs supplied through a parameter file:

  • Start the browser
  • Open each URL in a new browser tab, waiting 30 s in between
  • Close the browser (gracefully)
  • Repeat the above three times per browser
Techniques Used in the Browser Test Script

I only rarely use PowerShell; my main development work is in C++. Nevertheless, you might find some of the following interesting.

Starting Applications With Their Name Only

Windows has functionality for starting applications by name without requiring them to be part of the PATH environment variable. This is as useful as it is rarely known. I explained the mechanics in my article How the App Paths Registry Key Makes Windows Both Faster and Safer. The thing to note in this context is that App Paths entries can be leveraged from PowerShell with the Start-Process cmdlet. I used it in the script to start browsers by providing simple names like “chrome” or “firefox”.
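If you are curious which executable a name like “chrome” resolves to, the corresponding App Paths entry can be inspected directly (a quick sketch; the chrome.exe key only exists if Chrome is installed):

Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths\chrome.exe'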

Closing an Application’s Window Gracefully

When you start an application with the Start-Process cmdlet, it returns a process object. This object has an extremely useful method, CloseMainWindow(). It is equivalent to clicking on the “X” in the window’s upper right corner.
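A minimal sketch of the pattern, using Notepad purely for illustration:

# Start a process, keep the process object, then ask its main window to close
$p = Start-Process notepad -PassThru
Start-Sleep 5
$p.CloseMainWindow() | Out-Null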

The Browser Test Script

#
# One time setup:
#
# - Open all sites on the list in all browsers
# - Log on the test user UXMetricsGuyA (where applicable)
# - Switch the browser window to full screen
# - Config per site:
#   - Accept cookie popups
#   - Do not accept a site's notifications
#   - Enable "stay signed in" where applicable
# - Config per browser:
#   - Configure browser startup to not open previous tabs
#   - Configure start page: "about:blank"
#   - Do not save passwords in the browser
#   - Disable browser dialogs:
#     - asking about not being the default
#     - asking if you want to close all tabs
#
# Before each test run:
#
# - Empty each browser's cache
#   - Do not delete cookies
# - Close all browsers
# - Restart the machine
# - Log on as test user test01
# - Start a PowerShell console
# - Wait five minutes
# - Start this script
#

#
# Global variables
#

# How long to wait between open site commands
$waitBetweenSitesS = 30;

# How long to wait after a browser's last site before closing its window
$waitBeforeBrowserClose = 30;

# How long to wait between browsers
$waitBetweenBrowsers = 30;

# Name of the file containing the sites to open
$siteUrlFile = ".\URLs.txt";

# Number of iterations
$iterations = 3;

# Browsers to start
$browsers = @("chrome", "firefox", "iexplore")

#
# Start of the script
#

# Read the sites file
$sites = Get-Content $siteUrlFile;

# Iterations
for ($i = 1; $i -le $iterations; $i++)
{
    Write-Host "Iteration: " $i

    # Browsers
    foreach ($browser in $browsers)
    {
        # Sites
        $siteCount = 0;
        foreach ($site in $sites)
        {
            $siteCount++;

            if ($siteCount -eq 1)
            {
                if ($browser -eq "chrome" -or $browser -eq "firefox")
                {
                    # Start the browser with an empty tab because the first page load is currently not captured by uberAgent
                    $process = Start-Process -PassThru $browser "about:blank"
                }
                else
                {
                    # Start the browser with the first site
                    $process = Start-Process -PassThru $browser $site
                }

                # Store the browser's main process (the first one started)
                $browserProcess = $process;

                # Wait for the window to open
                while ($process.MainWindowHandle -eq 0)
                {
                    Start-Sleep 1
                }

                if ($browser -eq "chrome" -or $browser -eq "firefox")
                {
                    # Open the first site in a new tab
                    Start-Process $browser $site
                }
            }
            elseif ($browser -eq "iexplore")
            {
                # Additional IE tabs need to be opened differently, or new windows will be created instead
                $navOpenInNewTab = 0x800;

                # Get running Internet Explorer instances
                $app = New-Object -ComObject shell.application;

                # Grab the last opened tab
                $ie = $app.Windows() | Select-Object -Last 1;

                # Open the site in a new tab
                $ie.navigate($site, $navOpenInNewTab);

                # Release the COM objects
                Remove-Variable ie;
                Remove-Variable app;
            }
            else
            {
                # Additional tabs in Chrome/Firefox
                Start-Process $browser $site
            }

            Start-Sleep $waitBetweenSitesS;
        }

        Start-Sleep $waitBeforeBrowserClose;

        # Close the browser
        $browserProcess.CloseMainWindow();
        $browserProcess = $null;

        Start-Sleep $waitBetweenBrowsers;
    }
}

The Script’s URL Input File

The URL input file I used with the script in my 2018 tests looked like this:

https://mail.google.com/mail/u/0/#inbox
https://docs.google.com/document/d/1hOc4bdEQ1-KJ5wOsiCt4kVQB-xaHuciQY6Y4X_I7dYA/edit
https://www.google.com/maps/
https://twitter.com/
https://onedrive.live.com/edit.aspx?cid=740de493111072ca&page=view&resid=740DE493111072CA!108&parId=740DE493111072CA!106&app=PowerPoint
https://outlook.live.com/mail/inbox
https://www.dropbox.com/h
https://www.nytimes.com/
https://www.nbcnews.com/
https://edition.cnn.com/

The post PowerShell Script: Test Chrome, Firefox & IE Browser Performance appeared first on Helge Klein.

Gartner I&O Conference 2018: What’s next for IT ops?

Theresa Miller - Tue, 12/04/2018 - 15:18

This week I’m attending the Gartner I&O Conference – or the Gartner IT Infrastructure, Operations & Cloud Strategies Conference 2018. It used to be called the Gartner Datacenter conference, but times have changed. Now the scope of what we must architect has expanded. Is a datacenter just what you manage on premises? Is it your public cloud […]

The post Gartner I&O Conference 2018: What’s next for IT ops? appeared first on 24x7ITConnection.

New - Citrix Gateway (Feature Phase) Plug-ins and Clients for Build 12.1-50.28

Netscaler Gateway downloads - Mon, 12/03/2018 - 22:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

New - Components for Citrix Gateway 12.1

Netscaler Gateway downloads - Mon, 12/03/2018 - 22:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

New - Citrix Gateway (Feature Phase) 12.1 Build 50.28

Netscaler Gateway downloads - Mon, 12/03/2018 - 22:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

November news

The Iconbar - Fri, 11/30/2018 - 07:07
Some things we noticed this month. What did you see?

Elesar finds a way to improve performance on your Titanium machines with a free software update.

New stunt drivers game from AmCoG games.

Updated free guide to VirtualRPC in Use available to download.

Review of the new Raspberry Pi 3 A+ on the RISC OS blog.

Elesar updates !Prophet to 3.95

Categories: RISC OS

Download, Install, Import Visual C++ Redistributables with VcRedist

Aaron Parker's stealthpuppy - Wed, 11/28/2018 - 23:48

Note: for a more up to date version of the content in this article, VcRedist now has documentation available here: https://docs.stealthpuppy.com/vcredist

Last year I wrote a PowerShell script that can download, install or import the Visual C++ Redistributables into MDT or ConfigMgr. Long-term maintenance of the full feature set in a single script is a little unwieldy so I’ve re-written the script and created a PowerShell module – VcRedist.

Refactoring the script into a module has been a great little project for creating my first PowerShell function and publishing it to the PowerShell Gallery.

Why VcRedist?

At this point, I’m sure you’re saying to yourself – “Aaron, haven’t you just created Chocolatey?”. In a way yes, this module does exactly what you can do with Chocolatey – install the Visual C++ Redistributables directly to the local machine. Although you can download and install all of the supported (and unsupported) Redistributables, the primary aim of the module is to provide a fast way to download and import the Redistributables into the Microsoft Deployment Toolkit or System Center Configuration Manager for operating system deployments.

Module

The VcRedist module is published to the PowerShell Gallery, which means that it’s simple to install the module and start importing with a few lines of PowerShell. For example, here’s how you could install the module, download all of the supported Redistributables and import them into an MDT deployment share:

Install-Module -Name VcRedist
Import-Module VcRedist
$VcList = Get-VcList | Get-VcRedist -Path "C:\Temp\VcRedist"
Import-VcMdtApp -VcList $VcList -Path "C:\Temp\VcRedist" -MdtPath "\\server\share\Reference"

This results in each of the Visual C++ Redistributables being imported as a separate application with all necessary properties, including version, silent command line, uninstall key and 32-bit or 64-bit operating system support.

Visual C++ Redistributables imported into an MDT share with VcRedist

The same approach can be used to import the Redistributables into a ConfigMgr site:

Install-Module VcRedist
Import-Module VcRedist
$VcList = Get-VcList | Get-VcRedist -Path "C:\Temp\VcRedist"
Import-VcCmApp -VcList $VcList -Path "C:\Temp\VcRedist" -CMPath "\\server\share\VcRedist" -SMSSiteCode LAB

Just like with MDT, each Redistributable is imported into ConfigMgr; however, Import-VcCmApp copies the Redistributables to a share for distribution and creates an application with a single deployment for each one.

Visual C++ Redistributables imported into ConfigMgr with VcRedist

Of course, the module can download and install the Redistributables to the local machine:

Install-Module VcRedist
Import-Module VcRedist
$VcList = Get-VcList | Get-VcRedist -Path "C:\Temp\VcRedist"
$VcList | Install-VcRedist -Path C:\Temp\VcRedist

By default, this installs all of the supported Redistributables:

Visual C++ Redistributables installed locally with VcRedist

Note that the 2015 and 2017 Redistributables are the same version, so the end result will include only the 2017 versions.

Functions

This module includes the following functions:

Get-VcList

This function reads the Visual C++ Redistributables listed in an internal manifest or an external XML file into an array that can be passed to other VcRedist functions. Running Get-VcList will return the supported list of Visual C++ Redistributables. The function can read an external XML file that defines a custom list of Visual C++ Redistributables.

Export-VcXml

Run Export-VcXml to export the internal Visual C++ Redistributables manifest to an external XML file. Use -Path to define the path to the external XML file that the manifest will be saved to. By default Export-VcXml will export only the supported Visual C++ Redistributables.
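A minimal example (the target folder and file name are just an illustration and must exist on your system):

Export-VcXml -Path "C:\Temp\VcRedist\VisualCRedistributables.xml"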

Get-VcRedist

To download the Visual C++ Redistributables to a local folder, use Get-VcRedist. This will read the array of Visual C++ Redistributables returned from Get-VcList and download each one to a local folder specified in -Path. Visual C++ Redistributables can be filtered for release and processor architecture.

Install-VcRedist

To install the Visual C++ Redistributables on the local machine, use Install-VcRedist. This function again accepts the array of Visual C++ Redistributables passed from Get-VcList and installs the Visual C++ Redistributables downloaded to a local path with Get-VcRedist. Visual C++ Redistributables can be filtered for release and processor architecture.

Import-VcMdtApp

To install the Visual C++ Redistributables as a part of a reference image or for use with a deployment solution based on the Microsoft Deployment Toolkit, Import-VcMdtApp will import each of the Visual C++ Redistributables as a separate application that includes silent command lines, platform support and the UninstallKey for detecting whether the Visual C++ Redistributable is already installed. Visual C++ Redistributables can be filtered for release and processor architecture.

Each Redistributable will be imported into the deployment share with the application properties required for a successful deployment.

Import-VcCMApp

To install the Visual C++ Redistributables with System Center Configuration Manager, Import-VcCmApp will import each of the Visual C++ Redistributables as a separate application with a single deployment type. Visual C++ Redistributables can be filtered for release and processor architecture.

Tested On

Tested on Windows 10 and Windows Server 2016 with PowerShell 5.1. Install-VcRedist and Import-VcMdtApp require Windows and the MDT Workbench. Get-VcList, Export-VcXml and Get-VcRedist do work on PowerShell Core; however, most testing is completed on Windows PowerShell.

To Do

Right now, I have a few tasks for updating the module, including:

  • Additional testing / Pester tests
  • Add -Bundle to Import-VcMdtApp to create an Application Bundle and simplify installing the Redistributables
  • Documentation updates

For full details and further updates, keep an eye on the repository and test out the module via the PowerShell Gallery.

Image credit:

Alexey Ruban

This article by Aaron Parker, Download, Install, Import Visual C++ Redistributables with VcRedist appeared first on Aaron Parker.

Categories: Community, Virtualisation

Amazon’s Showcase of Innovation at AWS re:Invent 2018

Theresa Miller - Tue, 11/27/2018 - 06:30

The time is upon us for one of the year’s most anticipated technology conferences. Amazon’s re:Invent 2018 has kicked off in Las Vegas and is full of new products and innovations, with more to be announced throughout the week. Now, let’s take a look at what kicks off re:Invent 2018. AWS RoboMaker Makes Robotics Accessible […]

The post Amazon’s Showcase of Innovation at AWS re:Invent 2018 appeared first on 24x7ITConnection.

Enabling HTTP/2 in Apache on Ubuntu 18.04

Helge Klein - Mon, 11/26/2018 - 02:14

A number of requirements must be met before HTTP/2 can be enabled for a website. This is a compilation of steps I went through to get HTTP/2 working on our Apache web server hosting WordPress sites.

HTTP/2 Requirements

Requirement #1: HTTPS

HTTP/2 only works with HTTPS. If you have not switched your site to HTTPS, now is the time to do it. You might be interested in my article Switching a WordPress Site From HTTP to HTTPS.

Requirement #2: Apache 2.4.24

The first version of Apache to support HTTP/2 is 2.4.24. If you are on the LTS branch of Ubuntu, this means you need to upgrade to Ubuntu 18.04. I will describe the upgrade process from 16.04 to 18.04 in another blog post.

Requirement #3: PHP FPM

Short version: if you run PHP in Apache via mod_php, you need to switch to FPM. That is not a bad thing. FPM is newer and faster.

Long version: HTTP/2 is not compatible with Apache’s prefork multi-processing module. However, prefork basically seems to be obsolete so it does not hurt to switch to something more modern, i.e., the event MPM. That, in turn, requires a change in the PHP module from mod_php to php7.x-fpm.
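To check which MPM is currently active before and after the switch, you can list Apache’s loaded modules (a quick sketch):

sudo apache2ctl -M | grep -i mpm
# or, alternatively
sudo apache2ctl -V | grep -i 'Server MPM'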

Configuration Changes for HTTP/2

Switching Apache’s PHP Module from MPM Prefork to Event

Run the following commands:

sudo apt-get install php7.2-fpm
sudo a2enmod proxy_fcgi
sudo a2enconf php7.2-fpm
sudo a2dismod php7.2
sudo a2dismod mpm_prefork
sudo a2enmod mpm_event
sudo service apache2 restart

Caveat: W3 Total Cache Shows Apache Modules as Not Detected

W3 Total Cache seems to rely on the function apache_get_modules() to detect Apache modules, which does not work with FPM. According to this support article from Plesk, this issue can be ignored.

Installing and Enabling HTTP/2 in Apache

Enable the module mod_http2:

sudo a2enmod http2
sudo service apache2 restart

Enable the HTTP/2 protocol by adding the following to /etc/apache2/apache2.conf:

Protocols h2 http/1.1

How to Verify that HTTP/2 is Working

Cloudflare put together a comprehensive list of ways you can check a website for HTTP/2 support. The easiest to use are probably Chrome Dev Tools (network view, add the Protocol column) or the online test from KeyCDN.
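If you prefer the command line, curl can also report the negotiated protocol version, provided your curl build includes HTTP/2 support (the URL is a placeholder):

curl -sI --http2 -o /dev/null -w '%{http_version}\n' https://www.example.com/

If everything is working, this prints 2.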

The post Enabling HTTP/2 in Apache on Ubuntu 18.04 appeared first on Helge Klein.

Drag'n'Drop Summer 2018 edition reviewed

The Iconbar - Fri, 11/23/2018 - 08:01

If you want some RISC OS related news, reviews and projects (or are still missing the summer days), then the Summer edition of Drag'n'Drop Magazine is for you.

The magazine is published as a PDF file (so you can read it on any computer or print it out) and always reminds me of the 1990s-era Acorn magazines at their best. The magazine does not assume you have been using RISC OS since the 1990s though, and there is always a beginners' page to get you started.

The news section includes information on new hardware, and new commercial and free software. Because the magazine can be read online, there are lots of useful website links to follow. There is a review of the game Island of the Undead from Amcog Games and an ongoing series on using Schema2.

The bulk of the magazine features code - apps, games, hints and tricks to suit all levels of ability. You can learn about upscaling and interpolation, type in and play a platform game called Boing and an updated Space Invaders game, and build some more serious applications for the desktop and printing. My personal favourite in this edition is the guide to making a glass button.

This edition also includes an index at the end for Volumes 1-9 of the magazine (which can all be bought in a big back issue edition).

There is a free preview of the magazine to read, and you can buy the magazine directly from the website.

Categories: RISC OS

How to Limit CPU & RAM via the Windows Boot Configuration

Helge Klein - Wed, 11/21/2018 - 00:40

Testing the effects of different CPU and memory configurations is easiest when you run the tests on a powerful machine and restrict it to the required number of CPU cores and amount of RAM. Microsoft’s documentation of the relevant command is missing an essential parameter. Here are the commands you need.

Limiting the CPU to N Cores

On an elevated command prompt run:

bcdedit /set {current} numproc NUMBER_OF_CORES

Note: strangely, the numproc parameter is missing from the Microsoft documentation of bcdedit. However, it still works fine on Windows 10 1803.
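For example, to let Windows use only four CPU cores (the number is purely an illustration; a reboot is required for the change to take effect):

bcdedit /set {current} numproc 4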

Limiting the RAM to N MB

On an elevated command prompt run:

bcdedit /set {current} removememory MB_TO_REMOVE_FROM_INSTALLED_RAM

With: MB_TO_REMOVE_FROM_INSTALLED_RAM = INSTALLED_RAM - DESIRED_RAM

This is unnecessarily complicated. Instead of specifying the total RAM you want Windows to see, you specify how much of the installed RAM to remove (in MB).
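A worked example: on a machine with 64 GB of installed RAM that should be limited to 16 GB, you need to remove 48 GB, i.e. 49152 MB (the sizes are just an illustration):

bcdedit /set {current} removememory 49152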

Removing a Bcdedit Setting

To remove a setting, run the following on an elevated command prompt:

bcdedit /deletevalue {current} SETTING_NAME

E.g.:

bcdedit /deletevalue {current} numproc

The post How to Limit CPU & RAM via the Windows Boot Configuration appeared first on Helge Klein.

New - Latest EPA Libraries

Netscaler Gateway downloads - Mon, 11/19/2018 - 18:30
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads

PackMan in practice, part 2

The Iconbar - Fri, 11/16/2018 - 09:00
As mentioned at the end of part one, this article about creating PackMan packages is going to look at what's necessary to generate distribution index files, ROOL pointer files, and how these tasks can be automated. Towards the end I'll also be taking a look at some options for automating the uploading of the files to your website.

Index files and pointer files

Distribution index files

Distribution index files are the mechanism by which the PackMan app learns about packages - each 'distribution' (i.e. collection of package zip files hosted on a website) is accompanied by an index file. This index file contains information about each package; at first glance it looks like it's just a copy of the RiscPkg.Control file from within the package, but there are actually a handful of extra fields that are included within each 'control record':

  • Size - The size of the zip file
  • MD5Sum - The md5 of the zip file
  • URL - The (relative) URL to the zip file
Additionally, by convention the distribution index file should only list the most recent version of each package. It's also common (but not necessary) for the package files to contain the package version as part of their name, e.g. SunEd_2.33-1.zip. This way, although the index only lists the most recent version, you can still host the previous versions on your site if you want or need to.
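As an illustration, a single control record in a distribution index might look roughly like this (the values are made up; everything apart from Size, MD5Sum and URL is copied from the package's RiscPkg.Control file):

Package: SunEd
Version: 2.33-1
...other fields copied from RiscPkg.Control...
Size: 57344
MD5Sum: 0123456789abcdef0123456789abcdef
URL: packages/SunEd_2.33-1.zip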

ROOL pointer files

For the community distribution that's managed by ROOL, ROOL require package maintainers to provide a 'pointer file' containing a simple list of links/URLs to packages. ROOL's server will then regularly fetch this pointer file, fetch all the (changed) packages, and merge everything into the central distribution index file.

What we need is some kind of packaging tool

These rules for distributions, index files, and pointer files mean that the task of preparing package zip files for publishing is too complex to be attempted using simple Obey file or BASIC scripting. OK, if you were only interested in ROOL pointer files you could probably cobble something together without too much effort, but in most cases the use of a 'proper' tool is going to be required.

The good news is that to generate a distribution index file, a tool only needs two bits of information about each package: the URL the package is going to be hosted at, and a copy of the zip file itself. Everything else can be determined by examining the contents of the zip. This means that once you've got such a tool for generating index files, it should be very easy to plug it into your publishing pipeline.

And the even better news is that I've already written a tool - skopt, a.k.a. Some Kind Of Packaging Tool. Currently its feature set is quite small, but since it appears to be the only tool of its kind in the standard PackMan distributions, it's something that's very much needed if the ecosystem is to continue to grow. Additionally, it's built on top of the same LibPkg library that RiscPkg/PackMan use internally - helping to simplify the code (just a few hundred lines of C++) and ensure consistent behaviour with PackMan itself.

Functions

The current version of skopt has three modes of operation:

  • copy - Copy package file(s) from one location to another. However, unlike a simple *Copy command, this will rename the package zip to the 'canonical' form (i.e. SunEd_2.33-1.zip). It'll also raise an error if it tries to overwrite a package which has the same name but different content (which could indicate that you're trying to publish a new version of a package without increasing the version number).
  • gen-binary-index - Scans a list of folders containing (binary) package zip files and generates the corresponding distribution index file. It assumes that the filenames of the packages within the folders will match the names by which the files will be uploaded to your website. However you can specify per-folder URL prefixes.
  • gen-pointer-file - Scans a list of folders containing binary packages and generates a ROOL pointer file. As with gen-binary-index, it's assumed that the names of the files will match the names used once uploaded, and URL prefixes are applied on a per-folder basis.
Uploading packages

With skopt managing generating the index file(s) (and optionally package file naming), there's only one piece of the puzzle left: how can you upload the packages to your site?

Ignoring the obvious choice of doing it manually, there are at least a few RISC OS-friendly solutions for automation:

!FTPc

!FTPc can be driven programmatically by the use of Wimp messages, and comes with a handy BASIC library to simplify usage. So if you have FTP/FTPS access to your web site, only a few lines of BASIC are needed to upload files or folders.

scp, sftp

If you have SSH access to the web server, then scp (as included in the OpenSSH PackMan package) may be an easy solution for you.

The OpenSSH package also includes a copy of the sftp command, which is useful if your web site supports SFTP rather than the older FTP & FTPS protocols. sftp can easily be driven from scripts by the use of the -b option, which will read a series of commands from a text file.
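For example, a small command file - say upload-cmds (the file name, paths and host below are purely illustrative) - could contain:

put SunEd_2.33-1.zip packages/
put pkg-index packages/

and would then be run with:

sftp -b upload-cmds user@example.com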

rsync

rsync (also available via PackMan) is another option for copying files via SSH connections (or via the bespoke rsync protocol).

Next time

Uploading binary packages to PackMan distributions is all well and good, but what about the source code? Or what if you want to have regular web pages through which people can also download your software? Stay tuned!

Categories: RISC OS

New - NetScaler Gateway (Maintenance Phase) Plug-ins and Clients for Build 12.0-59.9

Netscaler Gateway downloads - Thu, 11/15/2018 - 22:00
New downloads are available for Citrix Gateway
Categories: Citrix, Commercial, Downloads
