Open Source Software and Chilli Con Carne

I am a big proponent of Open Source Software and all the things it has delivered for individuals, organisations and society. Where would we be today if it wasn’t for GNU, Linux, Apache, MySQL (MariaDB), MongoDB, JavaScript, jQuery, Perl, Python and PHP, not to mention NMIS and Open-AudIT?


These and so many earlier Open Source projects were foundational and fundamental to the Internet as it grew, and have been the grandparents, uncles and aunts of the more recent explosion of Open Source projects built around new innovations that could only have been created because of this heritage.

The classic birth of an Open Source project is: “well, I really like this (software|language|database) but it does not meet all my requirements, I think I will write one”, or “I have this problem and nothing existing really solves it the way I need it to, I think I will write one”, or even better, “this open source (software|language|database) is so good, how could I help to make it better?”

Open Source isn’t all about writing code; people can contribute in all kinds of ways, including testing, documentation, project management, requirements analysis and so much more.

Ultimately for me, Open Source Software is the awesome result of people with diverse backgrounds, skills, experiences and, probably most importantly, requirements working together to create a solution which embodies the definition of synergy. The result is something which is more generally useful to more people because of the diversity of this input.

Which brings me to Chilli Con Carne. I have loved Mexican food ever since I first went to Montezuma’s Restaurant in Taringa, Queensland as a teenager. From travelling to the USA and then living in California for a while, I learnt about the different types of Mexican food and how different Tex-Mex is from Mexican food. On more recent trips to Mexico I have learnt how awesome and diverse Mexican cuisine is.

But Chilli Con Carne is not Mexican, it is really Tex-Mex, and for me it also brings in some of the slow food movement’s ideas: cooking what you need, using local produce, in a traditional way.

I have been cooking Mexican food for years using meal kits, and finally I decided I could do better myself. So, with the help of YouTube and Jamie Oliver, I found a great recipe, which I adapted to what I had, and it produced an awesome result.

I was talking to my Opmantek colleagues about it, and they contributed some “code changes” to make it better. MarkD suggested smoked paprika instead of paprika, which was an amazing improvement. MarkH sent his Chilli Con Carne recipe, and I adopted the brown sugar and chocolate; this added a richness and smoothness to the dish.

Cooking is the ultimate in iterative development, cook, test, taste, improve, repeat. The current iteration of my Chilli Con Carne recipe is included below and it keeps changing and developing as I get new ideas and input from others.

For me, Chilli Con Carne is just like Open Source Software, the product of synergy.

Open Sauce Chilli Con Carne Recipe

I would call this a mild recipe; my kids have eaten it with no problem. Adding more chilli flakes or using hotter chillies will make it as hot as your taste prefers.

This batch makes enough to feed 8 with some leftovers; I usually cook a big batch and freeze some for convenient meals later.

Ingredients

Mexi Spice Mix

  • 3 teaspoons smoked paprika
  • 3 teaspoons of cumin
  • 2 teaspoons of dried oregano
  • Pinch salt
  • Pinch pepper
  • Zest of a lemon
  • Juice of the lemon

Vegetables Chopped Roughly

  • 2 rough cut onions
  • 1-2 red capsicums (bell peppers)
  • 1-2 yellow capsicums (bell peppers)
  • 1-2 green capsicums (bell peppers)

Chillies, cut up fine with seeds removed

(Leave the seeds in if you want more heat)

  • 1 large Poblano chilli
  • OR 2 Aussie green chillies
  • OR your favourite chillies

Other things to add

  • 2 tins tomatoes
  • 1/2 tin of water (use the water from the beans)
  • Coriander (cilantro)
  • 1 cinnamon stick
  • 2 tins black beans including water
  • 2 tins red kidney beans
  • 1 tablespoon light brown sugar (optional)
  • 60 grams unsweetened baking choc pieces (optional)
  • 4 teaspoons hot chilli flakes

Butcher

  • 1.4kg beef chunks


Preparation

Marinate the Meat

Make the Mexi Spice Mix and combine it with the meat, making sure it is really spread through all of the meat. Leave to marinate in the fridge for as long as you have time for; overnight is good, an hour or so is OK.

Cooking

If you don’t have time to marinate, that is OK; just prepare the meat the same way and put it straight into the pan.

I cook using a large electric fry pan, which works well and means I can leave it cooking overnight if I have time.

The intense part (10-15 mins)

  • High heat
  • Braise the beef on the stove top
  • If not already marinated, add in the Mexi spices
  • Add in the veggies, then the tomatoes, black beans and chillies
  • Break in the cinnamon stick

The easy part (~60 mins)

If you want the chilli thicker, cook uncovered; if you want it thinner, keep it covered.

  • Reduce the heat and cook for 15 mins (level 9, 180°C)
  • Stir and cook for another 15 mins
  • Reduce heat to simmer and check after 15 mins
  • Reduce heat as needed and check every 15 mins

Extra flavour as needed

While cooking, check the flavour and add more seasoning to taste, but add in small doses; stir through and taste again after 10-15 mins.

  • 1 teaspoon hot chilli flakes
  • 1 teaspoon of cumin
  • 1 teaspoon smoked paprika

The relaxing part (as long as you have time for)

  • Cover the dish
  • Reduce heat to a low simmer, probably the lowest setting you have
  • Leave for as long as you can; 2 hours is good, 4 hours is better, leaving it overnight is awesome
  • Keep an eye on total moisture.

Soupy Tip

If it is too soupy, scoop off some of the liquid and keep it as a soup; you can add beans to it and cook it up a little longer. There is so much flavour in that soup.

Serving

Serve as you like; in a bowl, covered in cheese with some sour cream and accompanied by corn chips, is pretty good.

If you prefer a thicker chilli, serve in soft tacos or burrito wraps.

Enjoy.


Testing Open-AudIT’s Discovery System

I have been asked numerous times to list exactly what Open-AudIT can discover, and answering “everything” causes people to doubt the ability of the product. It is almost as if we need to limit the discussion of the features to make it a believable product. That is not something I would ever like to do. So instead of bringing the product down, I thought I would demonstrate how to test the power of Open-AudIT quickly and easily.

To accomplish this, I imported the Opmantek Virtual Machine into VirtualBox and accessed my install from my browser. (More on that is available here if needed).

[Screenshot: Opmantek Virtual Machine page]
From here I clicked the Open-AudIT Community button, then the “Audit this PC” button, which downloads a shell script.
[Screenshot: Open-AudIT login screen]

I ran this script from my terminal and it output an .xml file containing the audit results. I logged into Open-AudIT (the default username/password is shown in the blue banner) and was able to import the results of the audit directly.
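
Before importing, you can also peek inside the audit .xml yourself. The sketch below is illustrative only: the element names (`system/hostname`, `software/item`) are a simplified, hypothetical structure, not Open-AudIT's real schema, so check an actual audit file from your own run before reusing the paths.

```python
# Sketch: inspect an audit result file before importing it into Open-AudIT.
# NOTE: the XML structure below is hypothetical and simplified for illustration.
import xml.etree.ElementTree as ET

sample = """<audit>
  <system><hostname>shaq</hostname></system>
  <software>
    <item><name>openssl</name><version>1.1.1</version></item>
    <item><name>curl</name><version>7.68.0</version></item>
  </software>
</audit>"""

root = ET.fromstring(sample)
hostname = root.findtext("system/hostname")
packages = [(i.findtext("name"), i.findtext("version"))
            for i in root.findall("software/item")]
print(hostname, packages)
```

For a real audit file, replace `ET.fromstring(sample)` with `ET.parse("audit_result.xml").getroot()` and adjust the element paths to match the actual schema.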

[Screenshot: Open-AudIT Import Device Audit]

After this was accomplished, I navigated to the device list and could see my computer, which may or may not be named after Shaq.

[Screenshot: Open-AudIT audit results]

The results also give a great indication of the software that is discovered by this process.

[Screenshot: Open-AudIT software discovered]

That was a really quick way to test the functionality of Open-AudIT with your own machine. The best part is that you can do this from any machine if you are using VirtualBox or a similar virtualization provider. Test it on your device today and you will see the amazing benefits of using it in your organization.


Introducing opConfig’s Virtual Operator

Introduction

opConfig’s new Virtual Operator can be used to create jobs comprised of command sets that can be run on one or many nodes, reporting to see job results, and troubleshooting to diagnose nodes that raise conditions through opConfig’s plug-in system. Quick Actions are templates that the Virtual Operator uses to save you from having to constantly re-create commonly run jobs. It also gives operators easy access to run commands on remote systems without giving them full access to the machines.

New Virtual Operator Job

To create a new Virtual Operator job, go to the Virtual Operator menu option and click New Virtual Operator Job. You will need to select which nodes you want to run commands on; these are auto-completed from the list of currently activated nodes in opConfig. Next, you can select which command sets should be run on the nodes; this is auto-completed from all command sets which opConfig has loaded. You can also use tags to select which command sets should be run. You can schedule the job to run now or at a later time; selecting later brings up a time-picker to schedule when the job shall run. A name is auto-generated from the data you have already entered, but this can be amended to anything you desire. The details section is a free-text field for keeping notes about the job. Clicking schedule adds the job to opConfig’s queue and takes you to the report schedule.

[Screenshot: opConfig New Virtual Operator Job]

Virtual Operator Report

A Virtual Operator Report is an aggregation of all data collected from your Virtual Operator job. On the left panel you have metadata about the job: how it was created, by whom, and when it is going to be run or when it was run. The commands panel is a paginated table of the successful commands which were run for the current job. If the command set uses a plug-in to show derived data or report conditions, these results are shown inline by clicking the expand icon in the derived column. If the condition has a tag, this can be used to help filter down command sets for creating linked Virtual Operator jobs off these conditions. All operations for the current job are shown to help diagnose connection or command issues that may have occurred.

[Screenshot: opConfig Virtual Operator result]

Virtual Operator Troubleshooting

If you have clicked the troubleshoot button from a report condition (see the screenshot above for the green button), you are taken to the new Virtual Operator job screen, but there are a couple of key differences. The node has already been filled in and the command sets have been filtered down using a tag; in this example, we have three command sets with the tag disk. This can help to create workflows where conditions are tagged to limit what the operator can select for the next steps in the troubleshooting process. When this job is created, the parent job’s ID is also recorded and the parent job’s name is shown in the newly created report.

[Screenshot: opConfig Create Linked Job]

Virtual Operator Results & Schedules

There are two final pages that are new: one that shows all scheduled Virtual Operator jobs and one that shows completed Virtual Operator jobs. Scheduled shows user-created running jobs and ones which are scheduled for the future. Results shows all the completed jobs, whether user-created or CLI-run.

[Screenshot: opConfig Virtual Operator Results view]

Quick Actions

Quick actions are templates for new Virtual Operator jobs; we have shipped four sample jobs, but you can create your own. Clicking a quick action button will take you to a new Virtual Operator screen and fill out the specified fields. To create your own, create a new JSON file under
/usr/local/omk/conf/table_schemas/opConfig_action-elements.json
{
  "name": "IOS Hourly Collection",
  "description": "Hourly baseline collection for Cisco IOS.",
  "command_sets": ["IOS_DAILY"],
  "buttonLabel": "Collect Now",
  "buttonClass": "btn-primary"
}


Key           Datatype          About
name          string            Name shown at the top of the quick action element
description   string            Text shown under the quick action name; useful to describe what the action does
command_sets  array of strings  Command set keys you wish to be run
nodes         array of strings  Names of nodes the command sets should be run against
buttonLabel   string            Text of the run button
buttonClass   string            CSS class applied to the button to colour it: btn-default, btn-primary (default), btn-success, btn-warning, btn-danger
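
Before dropping a new file into conf/table_schemas, it can be worth sanity-checking it against the fields above. The sketch below is only illustrative: which fields opConfig actually treats as required is an assumption here, and opConfig itself may validate differently.

```python
# Sketch: sanity-check a quick action definition against the documented fields.
# ASSUMPTION: the required/optional split below is inferred, not opConfig's own rules.
import json

REQUIRED = {"name": str, "description": str, "command_sets": list}
OPTIONAL = {"nodes": list, "buttonLabel": str, "buttonClass": str}

def check_quick_action(raw: str) -> list:
    """Return a list of problems found in a quick action JSON document."""
    problems = []
    action = json.loads(raw)
    for key, typ in REQUIRED.items():
        if key not in action:
            problems.append(f"missing required key: {key}")
        elif not isinstance(action[key], typ):
            problems.append(f"{key} should be a {typ.__name__}")
    for key, typ in OPTIONAL.items():
        if key in action and not isinstance(action[key], typ):
            problems.append(f"{key} should be a {typ.__name__}")
    return problems

doc = '{"name": "IOS Hourly Collection", "description": "Hourly baseline.", "command_sets": ["IOS_DAILY"]}'
print(check_quick_action(doc))  # -> []
```

Running `json.loads` alone will at least catch the smart-quote and trailing-comma mistakes that are easy to make when hand-editing these files.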

The final result is a dashboard that your organization could use today.

[Screenshot: opConfig Virtual Operator dashboard]

3 Essential Jobs That Process Automation Will Necessitate

Opmantek have been at the forefront of Network Automation for many years and with each implementation, we continue to see big changes to the way that people work.   As people move from being reactive to proactive, the work that they perform becomes more specialised and analytical.  This got me thinking about the new ways that IT teams are working alongside automated processes and the new job opportunities that are emerging as more and more businesses combine intelligent automation and human interaction to provide smarter and more efficient services to their customers.

Here are a few roles that I anticipate we will see new demand for over the next few years:

Automation Analyst

Alongside data analysts and business analysts, I anticipate a rise in demand for Automation Analysts.  These analysts will specialise in identifying processes that are ripe for automation.  They will undertake ‘process mining’ exercises to identify the approaches and techniques that humans currently undertake to produce outcomes.  They would then analyse these processes to determine best practice and map out the steps to be automated.

Integration Architect

As more tasks are automated and data-driven, IT managers are more likely to start looking at ways they can combine multiple technologies to solve problems and take advantage of business opportunities.  An integration architect would look to solve increasingly complex problems using technologies that are architected together to produce more powerful and efficient outcomes.

Data Quality Analyst

A Data Quality Analyst is responsible for making sure the data generated and moving between devices and applications is fit for purpose, correct and stays that way. The Data Quality Analyst would be responsible for monitoring the chain-of-custody of data as it makes its way between remote locations and cloud-based platforms to ensure that it maintains its integrity for consumption by machine learning and AI applications to make accurate decisions.

These creative and analytical roles will form part of the workforce shift that automation and the Fourth Industrial Revolution are likely to drive, as we move from task-driven jobs to results-driven jobs and companies change their business models to provide better, faster, augmented services.

If you are an IT Manager looking at implementing a Network Process Automation project in the near future, we will be releasing a white paper later this month that provides a step-by-step guide to getting your first project underway.  Contact us to receive early access to the white paper.


Key differences between phishing attacks and ransomware attacks

The cyber security spotlight has been directed firmly at ransomware in recent times. Yet a recent report in the United Kingdom highlights the fact that phishing remains a real headache for businesses, government organisations and not-for-profits. The Cyber Security Breaches Survey 2019, conducted by the Department for Digital, Culture, Media and Sport, found nearly one third of businesses (32%) and about one fifth of charities (22%) experienced cyber-security breaches in the previous 12 months. Of these, 80% of businesses and 81% of charities experienced phishing attacks – a considerably higher percentage than those that experienced viruses, spyware or other malware, including ransomware (27% of these businesses and 18% of these charities).

So what are the key differences between phishing attacks and ransomware attacks – and why are phishing attacks a deep concern for businesses? A phishing attack generally involves a malicious person using social engineering techniques to trick a person into supplying sensitive personal or business information, whereas a ransomware attack (that can be delivered through a phishing communication such as an email) aims to extract a ransom from a victim by locking their files and demanding payment for a key to regain access.

Phishing messages often direct victims to fake websites – which may include branding and information copied from legitimate websites to appear authentic – and ask them to enter their details.

How do you limit the risk to your business – including your people – of being compromised by a phishing attack? The answer is a combination of education, awareness, technologies and processes. The Australian Government’s Stay Smart Online website includes a list of steps your people and your business can take to minimise the risk presented by phishing. These include advising your people to avoid clicking on links or opening attachments in unexpected or suspicious emails and contacting senders to verify concerning emails, using details sourced from a legitimate website or location. Your business should also install and update spam filters and other anti-malware products to help minimise risk.
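
Some of these technical protections come down to simple heuristics. As one illustrative sketch (not any particular vendor's method), an email filter can flag links whose visible text looks like one domain while the underlying href points somewhere else – a classic phishing pattern:

```python
# Sketch of one simple phishing heuristic: flag HTML links whose visible text
# looks like a domain that differs from where the href actually points.
# Real filters combine many more signals; this is illustrative only.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.href = None      # href of the <a> tag we are currently inside
        self.flagged = []     # (shown_text, actual_host) pairs that mismatch

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.href = dict(attrs).get("href", "")

    def handle_data(self, data):
        if self.href and "." in data:
            shown = data.strip().lower()
            actual = urlparse(self.href).hostname or ""
            if shown and actual and shown not in (actual, "www." + actual):
                self.flagged.append((shown, actual))

    def handle_endtag(self, tag):
        if tag == "a":
            self.href = None

checker = LinkChecker()
checker.feed('<a href="http://evil.example.net">mybank.com</a>')
print(checker.flagged)  # -> [('mybank.com', 'evil.example.net')]
```

A human can apply the same check manually: hover over a link and compare the destination shown in the status bar with the text of the link before clicking.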

FirstWave’s Cloud Email Security product provides advanced, feature-rich and configurable cloud email security services for businesses – powered by its cloud email content security and analytics platform technology. If you would like to learn more, please contact sales@firstwavecloud.com.


Ransomware on the rise in Q1 2019

Businesses beware: ransomware is back and the attacks are more complex and costly than ever. Ransomware campaigns targeting businesses rose in January-March 2019, compared to October-December 2018. Businesses also typically paid out more to the groups behind ransomware to retrieve their files, while infections caused more downtime, on average, than during the previous quarter.

These trends  – highlighted in research from a range of vendors – represent a continuation of worrying circumstances identified in a Telstra report released last year. The report found ransomware was on the rise and was increasingly targeted. Nearly one third – 31%  – of Australian respondents whose businesses had been interrupted by a security breach in the past year were experiencing ransomware attacks on a weekly or monthly basis. This was the highest of all countries surveyed.

These findings highlight the importance of vigilance and preparedness in protecting networks and data. This means educating your workforce and working with partners, customers and participants in your supply chain to avoid clicking on suspect email links or attachments. Ransomware groups often incorporate text in these emails that aim to trick people into clicking quickly on these malicious links or attachments.

Other steps your business should take include ensuring anti-malware products are implemented and up to date and taking regular backups that are then stored in isolated locations. Your business should also document the steps to be taken and the responsibilities of individuals and teams in the event of a ransomware infection. These measures will help minimise loss and disruption.

At FirstWave, we provide email and web security solutions featuring advanced malware protection to help businesses avoid falling victim to ransomware and other attacks. For example, our Cloud Email Security product provides advanced, feature-rich and configurable cloud email security services for businesses – powered by our cloud email content security and analytics platform technology. If you would like to learn more, please contact sales@firstwavecloud.com.
