Recent Scams Targeting ASIC Customers


It has come to our attention that scammers pretending to be from ASIC have been contacting registry customers asking them to pay fees and give personal information to renew their business or company name.

These emails typically contain a link that either opens an invoice with fake payment details or, if clicked, infects your computer with malware.

Warning signs the email is not from ASIC

An email is probably a scam and is not from ASIC if it asks you:

  • to make a payment over the phone
  • to make a payment to receive a refund
  • for your credit card or bank details directly by email or phone

Here is an example of a scam email from 5 December:

[Image: example scam email, 5 December]

If the email you received contains the above information, it is not from ASIC. 

How do I protect myself from email scams?

To help protect yourself:

  • keep your anti-virus software up to date
  • be wary of emails that don't address you by name or misspell your details and have unknown attachments
  • don't click any links on a suspicious email

We also strongly advise checking your registration renewal date; ASIC will only issue a renewal notice 30 days before your renewal date. Search your business name on the ASIC register - if the notice arrives outside your usual renewal time frame, it is most likely a scam.

How do I notify ASIC of a potential scam?

If you would like to notify ASIC of a potential scam email, you can forward the entire email to ReportASICEmailFraud@asic.gov.au

To ensure your systems are well protected, get in touch with the Advance team today. We're always looking out for you!

Is Your Disaster Recovery Up To Date?


What should I do with my old hardware?

An all too common trend in the IT industry is to give ex-production hardware a new lease of life running the disaster recovery site. Tight budgets often restrict capital expenditure to areas where real value is visible, and the impacts and results are noticed throughout the organisation. 

These initial savings can be quickly forgotten when an unplanned incident forces the switch over to your disaster recovery site. Previous testing may have been successful on the DR equipment during your routine maintenance and test restores, but when a major incident occurs, are you confident that your DR is up to the task?

These are the questions you should ask yourself:

  • Will the dated hardware run our complete production workload?
  • How big is the impact on our users?
  • How long can we operate utilising the DR site before losing business?
  • How big is the impact on our customers?

It is not unusual for companies to consider high-end hardware sitting offsite, doing nothing 98% of the time, to be a waste of resources.

The key is to justify the initial expense, leveraging the DR site to provide an additional return on investment. An effective strategy is to live boot a complete clone of the production environment on a separate virtual segment, presenting a fast and accurate test development system.

Utilising Veeam combined with HPE Nimble Secondary Flash Array technology and your favourite hypervisor, you can achieve a fast, production-ready DR solution, with the added benefit of a fully functional test or development system at your fingertips that can be spun up in minutes.

If you want to learn more about disaster recovery solutions, please contact the team at Advance today.

Minimising a Ransomware Attack


What is Ransomware?

Ransomware is malicious software, installed or downloaded to a computer, that once activated blocks access to that computer system until a sum of money is paid. Typically, the sum demanded is not large compared to the cost of the time and effort it might take to restore or otherwise resurrect the files.

For example, your work computer containing important documents has been held ‘hostage’ and you are required to pay USD$500 to regain access to your files – when calculating the time and effort required to restore the computer back to the original state, even with good backups, you are likely to exceed that figure.

Two well-known ransomware threats that have recently received considerable press coverage for their widespread nature are the WannaCry and Petya attacks. These aren't the only ransomware threats out there; there are hundreds, and they won't stop circulating.

How do I minimise my risk of getting ransomware or having to pay for my files to be decrypted?

This is truly a case of being vigilant and taking precautions so as not to be caught out and taken advantage of by a Ransomware attacker.

On your computer

Make sure important data is not stored only on the computer! Backing up important files to an external hard drive (not permanently attached to the computer) is a good idea. Note that cloud backups with automatic sync (such as Dropbox, Google Drive, OneDrive etc.) may also be compromised when infected files sync. It raises the question: do you always need to have these turned on by default?
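As a minimal sketch of the "separate snapshot" idea (the function and folder names are our own, not from any backup product), the script below copies a folder into a new timestamped directory, so a later infection cannot silently overwrite every earlier copy the way an automatic sync can:

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_folder(source: str, dest_root: str) -> Path:
    """Copy `source` into a new timestamped folder under `dest_root`.

    Each run creates an independent snapshot; older snapshots are
    never touched, unlike a sync service that mirrors infected files.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = Path(dest_root) / f"backup-{stamp}"
    shutil.copytree(source, target)
    return target
```

Pointing `dest_root` at an external drive that is unplugged between runs keeps the snapshots out of reach of ransomware running on the computer.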


Ensure that your operating system and antivirus are up to date (including the latest security updates and virus definitions) and use some form of ad-blocker to avoid the threat of malicious ads. To go even further, refrain from using an administrative account on your computer and disable macros in Office products by default.

Keep your browsers updated and remove outdated plugins and add-ons from your browsers. Remove Adobe Flash, Adobe Reader, Java and Silverlight from your browser plugins - if they are needed then set the browser to prompt for activation when these plugins are required to run.

General Behaviour

Learn the typical signs of a spam message and don’t open any suspected spam message from an unknown sender.

Be very cautious of any attachment within an email that you are not expecting. A contact's email account can be compromised and used to distribute a virus in a message that looks totally innocent. If in doubt, ask the sender over the phone or IM whether they intentionally sent the attachment.

Be extra cautious of all links in emails, as links can be made to look valid but take you to malicious sites instead.

Conclusion

The best form of protection against a virus or ransomware is prevention. By changing your mindset around emails, links, attachments and computer updates you can drastically increase your chances of avoiding these threats. Stay vigilant!

For more information on minimising a ransomware attack in your business, speak to a member of the Advance team today!

Security Considerations When Employees Leave


With an increase in the use of external websites which store data, personal mobile devices being used for work and the rising trend of employees working outside the traditional workplace model, you need to ask yourself: are you doing enough to ensure the security and confidentiality of your and your customers' information?


When an employee leaves a business, it is imperative that a process is followed to de-provision access to systems they may have used. Here a problem arises: it is likely that the company has not kept sufficient records of what the now ex-employee could access, and will therefore miss one or more systems the employee can still reach.

As an example, have a look at some access rights that an employee may begin with and gain over their tenure with your business:


  • Internet Access
  • Internal WiFi Access
  • Domain Access
  • Security/Alarm access codes
  • Website Passwords
  • Social Media Passwords
  • Credit Card Details
  • Car Keys
  • WiFi access
  • Stored login information on personal devices
  • Cloud Account login information
  • USB backups held offsite by that employee
  • VPN Details to connect to the internal server
  • Knowledge of other employees' usernames and passwords


The more access an employee is given in confidence, the more work is needed to remove it when they leave, making the whole termination process liable to human error. It is vital to ensure that employee access to systems and data is de-provisioned completely and on time to protect your business.

Simple Steps: Begin with provisioning and recording

Once a decision has been made to hire an employee for a certain role; access rights, hardware requirements and external access should be determined prior to their start date. This information needs to be recorded consistently, and an approval process needs to be in place for any security related process or device.
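As a minimal sketch of "recording consistently" (the field and function names below are illustrative, not from any HR or ITSM product), each approved access grant can be kept as a structured record so the full list can be pulled up at offboarding time:

```python
from datetime import date

def record_grant(register: list, employee: str, system: str,
                 access_level: str, approved_by: str) -> dict:
    """Append one approved access grant to the register and return it."""
    entry = {
        "employee": employee,
        "system": system,
        "access_level": access_level,
        "approved_by": approved_by,
        "granted_on": date.today().isoformat(),
    }
    register.append(entry)
    return entry

def grants_for(register: list, employee: str) -> list:
    """Everything this employee can access -- the offboarding checklist."""
    return [e for e in register if e["employee"] == employee]
```

Whether the register lives in a spreadsheet, a database or a service-desk tool matters less than the habit: every grant is recorded with an approver, so nothing is left to memory when the employee leaves.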

Using a hardware or software solution, enable enough security to prevent users from using their own file sync solutions (e.g. Dropbox, Box etc.). The same applies to USB devices: implement hardware or software restrictions to ensure that USB devices can only be used with the right approval.

If users have private work information or data on a mobile phone, implement a device management system that supports the remote wiping of data on mobile devices – this includes tablets. An extra measure would be to encrypt laptops and hard drives to ensure that no sensitive information is lost when a device is lost or misplaced.

Simple Steps: Employee leaving

Once an end-date has been determined for an employee, they should be put into a process to have their rights and access removed – starting with a review of your documentation on their current access. Once their end date is reached, the removal should begin almost as soon as they are out the door.

Retrieve any hardware and mobile devices that belong to the business, change passwords for accounts that don't have unique logins for each user (e.g. social media), remove the user's security access to the building (changing the PIN code if necessary) and, if the office WiFi uses a single password, have it changed. If the employee had a credit card, ensure it is cancelled completely and they are removed from the account.

Simple Steps: General policies

To reduce the impact of an employee's departure, it is beneficial to implement policies and access methods that reduce the need for hands-on changes which can affect other staff (password resets, access code changes, etc.).

Our tips:

  • Ensure that each user has their own personal login where possible, including domain access, systems that are used and websites.
  • Ensure that important financial information is never given out to employees. If they do have a credit card, it should be on the business account but under their details, with its own limit.
  • Limit access to USB ports and other ports that can transfer information, ensuring that employees do not have installation rights.
  • Ensure all employees understand the importance of not sharing usernames and passwords.
  • Rather than using a WiFi password to authenticate wireless users, authenticate by MAC address with approval, keeping a record of which devices belong to whom.
  • Do not give any employee access to social media sites. This should be controlled by one person only and when that person leaves then all passwords should be changed immediately.


What can’t be helped

Even with the best security and processes in place, there will always be ways that your security could be compromised. However, with effective internal processes, good documentation, follow ups and reviews of your procedures you can drastically minimise the effect of an employee leaving.

Why not start looking at your systems now?

For more information on strengthening your IT security please contact us.

Are Your Business Processes a Target For Scammers?


Cyber criminals are tricking CEOs out of millions of dollars by exploiting their organisations' poor business processes and fooling unsuspecting employees into transferring money. The growing trend, known as 'CEO Whaling', involves plain-text e-mails being sent to employees responsible for financial transactions, masquerading as their boss and requesting that invoices be paid urgently. Those falling victim have no way to recover the money, as insurance generally does not cover international fraud.

These highly organised con artists are not just spamming companies at random; instead, they're using social media to research potential victims and taking advantage when they're most vulnerable. For example, they may identify through social media that the boss, or the person responsible for financial transfers, is on holiday, and that's when they strike, sending an e-mail saying they're about to get on a flight and need an invoice paid urgently. They use a fake e-mail address and include some personal details uncovered via social media to give the e-mail just enough validity. The employee is tricked into believing the payment needs to be made, and that requesting confirmation will probably make their boss angry given the delay caused by being on a flight and unable to respond.


Organisations with business processes that rely on an e-mail from the boss for financial approvals are at high risk of falling victim to this scam, as the process includes no validation that the invoice hasn't been modified or that the approval has come from the person with authority to approve it. Busy people find the use of e-mail in a process like this convenient, as approvals can be sent at will from virtually anywhere, on any device, at any time, putting them at risk of being exploited. Processes that involve printing, stamping, signing and shuffling paper around for approval stall when the approver is not in the same location as the document, and allowing e-mails to be used in place of an actual signature on the document makes the process susceptible to scammers. This issue was recently reported in The Advertiser; read the article here: http://www.adelaidenow.com.au/technology/how-australian-bosses-are-being-tricked-out-of-millions-of-dollars-by-cyber-criminals/news-story/57318e06c02a8215b8d67d521a219aea.

One way to avoid being tricked by the scammers is to implement a flexible solution like M-Files, where the business process is migrated into the system with secure access provided via desktop, web and mobile app. M-Files stores a single electronic version of the invoice, with security that restricts access to only the people involved. This avoids copies of the invoice being e-mailed; instead, everyone involved refers to the same version stored in M-Files.

With the approval process managed via workflow, the approver is notified of an invoice to approve and is required to authenticate themselves to view and approve it, which can be done quickly and simply via the mobile app using fingerprint authentication. The people responsible for payment are then notified and required to authenticate to access the approved invoice. M-Files keeps a detailed version history of every change the document goes through, so if the person responsible for payment wants validation that the boss approved the invoice for payment, they can review the document's history to confirm it was actually approved by the boss's user account. The version history can also be used to identify changes to the original document, potentially catching fraud attempts where the bank details on an invoice have been changed.

Aside from not falling victim to fraud, the benefits of keeping records electronically rather than physically include incredibly fast retrieval of information and increased office space when you recycle the filing cabinets for scrap metal.


If you’re still using a manual process that involves printing, stamping, signing and shuffling paper around your organisation for approval, and that process can be short-circuited by e-mails, you are at risk of being scammed. If you think it won’t happen to you, think again: the Federal Government has been briefed on the severity of this trend as the losses climb into the millions. If you want to know more about how M-Files can help your business, please contact us.

Artificial Intelligence for a Repository Neutral ECM



On a recent trip to California I discovered how convenient instant access to information from Google Assistant on my Pixel phone was to help make decisions in a place I was completely unfamiliar with. While navigating to the next stop I could ask for ‘places to eat’, ‘gas stations’ or ‘tourist stops’ and have suggestions, from data scattered all over the web, presented in real-time in Google Maps. Imagine if your ECM could do the same and present information and search results from all the different systems and repositories in your organisation in one simple familiar interface.

For this to work, the ECM would need a common interface that connects to your CRM, accounting system, shared network drives, file syncing services like Dropbox and OneDrive, e-mails and SharePoint, as well as some way of reading all the content in those repositories and intelligently storing metadata to allow you to search on it. Combine that with the ability to add your own metadata to those items while preserving the content in its original repository, so nothing prevents its continued use in the original system, and you would have a very user-friendly 'Repository Neutral ECM' where the context is more important than where something is stored.

The figure below provides an overview of the 'Repository Neutral ECM' architecture that M-Files will release later this year, with a vision that 'Context is King'.

[Figure: 'Repository Neutral ECM' architecture overview]

The 'Unified User Experience Layer' provides a single, familiar user interface for interacting with information regardless of the original repository: think of it as Google Maps. This includes simple access from any device, including mobile apps for phones and tablets in addition to PCs. Just like Google Assistant's ability to present outside information in Google Maps by simply asking, a single user interface means the user doesn't need to learn other systems to find relevant content in them, and they can add their own metadata without stopping the content from being used in the original system.

The 'Multi-Repository Backend' connects with the organisation's repositories and systems via 'connectors'. These include a set of core 'out of the box' connectors for repositories like network file shares, Office 365 and SharePoint, but third parties can also develop connectors for other repositories and systems. This allows organisations to preserve legacy systems and avoid expensive integrations or migrations to new systems just to add functionality.
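To illustrate the idea (this is our own conceptual sketch, not M-Files' actual connector API), a connector boils down to a small common interface that each repository type implements, so the layers above never need repository-specific logic:

```python
from abc import ABC, abstractmethod
from pathlib import Path

class RepositoryConnector(ABC):
    """Common shape every backend connector implements."""

    @abstractmethod
    def list_objects(self) -> list:
        """Identifiers of the objects held in the repository."""

    @abstractmethod
    def read_content(self, object_id: str) -> bytes:
        """Raw content of one object, for indexing and preview."""

class FileShareConnector(RepositoryConnector):
    """Connector for a plain file share (here, a local path)."""

    def __init__(self, root: str):
        self.root = Path(root)

    def list_objects(self) -> list:
        return [str(p) for p in self.root.rglob("*") if p.is_file()]

    def read_content(self, object_id: str) -> bytes:
        return Path(object_id).read_bytes()
```

A SharePoint or CRM connector would implement the same two methods against its own API, which is what lets third parties plug new repositories in without touching the layers above.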

The 'Intelligent Metadata Layer' (IML) contains the intelligence components and multi-repository search, along with the typical capabilities of an ECM such as dynamic views, workflow, security, version control and check-in/check-out. The intelligence components support automatic classification and metadata suggestions using text analytics. Like the multi-repository connectors, third parties can add 'metadata providers' for specific industries or use cases. Along with text analytics, this layer includes machine learning to help improve suggestions based on user behaviour.

The power behind IML’s ‘Intelligence Components’ comes from the integration of Artificial Intelligence (AI) from Abbyy into M-Files. Abbyy produces Artificial Intelligence technologies based on textual content capture and OCR. This AI technology allows text to be understood and interpreted based on its content using algorithms that analyse the meaning of the words and the relationships between them. This allows accurate classification of complex and unstructured data in real time.


It’s exciting to see this automatic classification and metadata tagging in action: drag and drop a document into M-Files and you’re presented with ‘tags’ or ‘suggestions’ that you can click on to populate the metadata fields, similar to how Google Assistant effortlessly presents pins on Google Maps when you ask for ‘places to eat on my route’.

The ‘tags’ are based on the content of the document being passed through the Intelligence Services in IML and returning matches. If you don’t like the suggestion you can still select metadata as you would in the past and the AI learns from your behaviour. This technology will improve the efficiency and accuracy of data typically entered by humans as the suggestions help you make the right selection.
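A heavily simplified sketch of content-based tag suggestion (keyword matching only; the real Intelligence Services use far richer text analytics and machine learning, and the vocabulary below is invented for illustration):

```python
def suggest_tags(text: str, vocab: dict) -> list:
    """Return candidate metadata tags whose trigger keywords
    appear in the text. `vocab` maps tag name -> keyword list."""
    lowered = text.lower()
    return [tag for tag, keywords in vocab.items()
            if any(k in lowered for k in keywords)]
```

Even at this crude level, the workflow is the same as described above: the content is analysed, candidate tags come back, and the user confirms or overrides them, with the overrides feeding back into better future suggestions.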

The benefits of IML don’t stop at metadata suggestions; there are also the External Connectors to other repositories. We’ve all used Windows folder search and most likely found it painful at the best of times, especially on a network share. This is where IML’s External Connectors can help: because the content is indexed by the connector, you can use M-Files’ powerful search feature to quickly locate a file based on its content rather than where you think it might be stored. It’s lightning fast, and allows you to add your own metadata to any object from any repository to help you manage your information better. Having a connector for every repository in your organisation is a powerful concept that is difficult to ignore.


The Intelligent Metadata Layer allows organisations to have a true Repository Neutral ECM by providing Intelligence Services and External Connectors that present information from all the different systems and repositories in a single, simple-to-use interface. It allows them to keep their legacy systems and avoid expensive integrations and migrations while providing simple, efficient access. If you’d like to find out more about M-Files and how the Intelligent Metadata Layer can help your organisation, please contact us.

Read my blog on 5 Things to Consider When Preparing for a Repository Neutral ECM.

5 Things to Consider When Preparing for a Repository Neutral ECM


1. Business Requirements

Establish the business requirements as a clear goal for your project, and speak to all the departments across all locations and facilities in the organisation to get an indication of how many employees need access. On one of my early projects, the number of employees needing access grew from an initial 15 to 115 during business requirements discovery; fortunately, the architecture scaled easily for the multi-site distribution of employees.


Be very clear about what you are trying to solve with each requirement and ensure that each stakeholder has had a chance to provide their list of requirements. On a recent project, it became apparent that one of the biggest issues for a majority of employees was that the information they needed was locked in a system they had no access to. This led to them either using inaccurate or out-of-date information, or using inefficient methods to access the information through someone with a licence. Management hadn’t provided access because the licences were considered expensive, and wasn’t aware of the impact the workaround methods were having on the organisation.

Prioritise the requirements with your project team and base the order on importance, technical complexity, risk and cost to implement. At a project where we were asked to provide a solution to standardise the handling of proprietary formulas within an organisation, several steps leading up to the conception of these formulas needed to be in place prior to work starting on the actual formulas themselves.

2. Current Information Locations

Identify all existing locations where information is stored including documents in file shares and file syncing services like Dropbox and OneDrive, databases including financial, service & CRM information and portals. A quick way to get a concise list is to ask finance for details on the software subscriptions and maintenance they pay or have paid in the past.

Establish the current volume and the annual volume increase, as well as the types of information stored, e.g. proposals, invoices, drawings, customer service tickets etc. Modern ECMs like M-Files utilise compression and binary delta algorithms to efficiently store versions of documents, so your annual volume increase for migrated repositories will be considerably less. The site admin on one of my projects stated that after moving to M-Files, where the chance of duplication and multiple versions of files was essentially wiped out, they went from network share storage increasing by 1TB per year to the M-Files vault increasing by only 50GB per year.


Determine which of these repositories need to remain in operation and which could be migrated into your ECM and retired. We usually migrate things like legacy Access databases that perform simple tasks, such as providing unique identifiers (e.g. batch numbers), so the ECM then provides the batch number as part of a workflow. You may have situations where it’s critical to preserve a legacy repository, like a customer portal that allows service tickets to be raised. Its content can still be made available in the ECM for search and other purposes while its original functionality is preserved.

3. Security Requirements

Review the current levels of security within each repository that will be accessed via the ECM and map them to one of the scenarios below. The credentials used to access the external repository will be determined by the type of access specified for the connection. As an example, providing public access to supplier and customer lists may be necessary for all users in the ECM, as this information is useful as metadata for other objects, whereas you may want to limit access to project-related data to only the people in the project team. We often provide ‘metadata-driven’ permissions on project-based data by including ‘project team’ metadata with the project, so security access can be easily managed by the client.

[Table: access scenarios for external repositories]

The scenarios to consider when providing access to a repository via an ECM can be split into several categories:

Public

A common authentication is used to connect to the external repository; the ECM then controls access to the content via its internal security, e.g. a public network share.

Public with Varying Permissions

Users and groups in the ECM are mapped to users and groups in the external repository to control access to specific content, e.g. a network share with ACL restrictions for certain groups.

User-Specific

The external repository dictates access rights, requiring ECM users to log into the repository with their own credentials, e.g. SharePoint.
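During planning, it can help to capture this mapping explicitly. A toy sketch (the repository and scenario names below are invented placeholders, not product configuration):

```python
# The three access scenarios described above.
SCENARIOS = {"public", "public_varying", "user_specific"}

# Hypothetical mapping of each repository to its scenario.
repository_access = {
    "public_network_share": "public",
    "restricted_network_share": "public_varying",
    "sharepoint": "user_specific",
}

def credentials_needed(repo: str) -> bool:
    """True when each ECM user must log in with their own credentials."""
    return repository_access[repo] == "user_specific"
```

Walking every repository from step 2 through a table like this makes it obvious up front which connections need per-user credentials and which can share a service account.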

4. Hosting Requirements

Determine whether the system will be hosted on-premise, in the cloud or in a hybrid arrangement, to enable planning for hardware, review of service agreements with cloud providers, or both. To avoid delays in starting projects, we’ve found development can be done on cloud servers during hardware procurement and deployment, then transferred once the on-premise environment is ready. It’s also quick and very easy to change cloud server specs to increase performance if needed.


Use the current volume plus the expected annual volume increase (from step 2) to determine what sort of backend the ECM requires, as well as to establish storage and backup requirements. M-Files recommend using the embedded database option (Firebird) for up to 50,000 objects and Microsoft SQL Server once that has been exceeded. If using Microsoft SQL Server, you also have the option of storing the file data within the database or as separate files; there are pros and cons that I’ll go through in another blog.

Size the hardware based on the number of employees and the volume of data to be stored (from step 2), using the business requirements (from step 1) to help. Identify how the connection will be made to each external repository (local or cloud) so you can determine whether connectivity is direct or a VPN is required. Where connectivity is difficult, it may be feasible to maintain a local copy that’s refreshed periodically, or use technology that provides these capabilities.

5. Access Requirements

Establish the landscape for how employees will access the ECM, keeping in mind it will become the central point of reference for the connected external repositories. Most ECMs support access through Windows desktop clients, web access and mobile clients. If the ECM will be available externally, securing access via SSL or VPN is critical. On most of our M-Files deployments, our clients not only want access to M-Files via their mobile phone, but also on their laptops from anywhere! We use their SSL certificate (required for mobile access) and set up what’s called ‘HTTP over RPC’ so their M-Files Desktop Client connects securely whenever an internet connection is present. If you want to know more about setting up HTTP over RPC for M-Files, contact us.


Some ECMs support replication strategies where servers can be hosted in multiple locations and cache or replicate from a central location to provide efficient access to information. We’ve delivered successful projects where M-Files outperformed SharePoint when deployed to a customer’s remote locations as ‘cache’ servers that connect back to the main M-Files server via hardware based VPNs over 3G links. Consideration needs to be given to the technologies available to help meet access requirements.

For more information on M-Files, contact us.

Consolidate Your Data and Make It Easier To Access


As organisations grow over the years, so does the assortment of tools employed for various projects and departments. This often causes a headache for employees and business owners, as information becomes scattered amongst several disparate systems and locations.

Generally there are different products on different platforms with different security and data requirements. Together they help a user do their job, but they sit on different servers, possibly even in different locations, with different access and user rights.

This is a problem that affects many organisations today, and the problem will only get worse as more data is made available to employees.

By using an Enterprise Portal organisations can optimise their information management and empower their staff with personalised information in one place, sometimes with just one click.

An Enterprise Portal can be designed to merge this disparate information into one place, ready for the user to access and interact with at the click of a button. An example might be where information is gathered from the:

  • ERP system

  • Production planning and control system

  • Employee timekeeping system

  • Inventory management

to be made available to the user with a simple mouse click. An extra benefit is that users don’t need to log into each individual system separately, which saves time.
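A toy sketch of the aggregation behind such a portal page (system names and fields below are invented for illustration): the portal merges whatever each backend system holds for one item into a single view, tagging every field with its origin so the user can see where each value came from:

```python
def component_summary(component_id: str, sources: dict) -> dict:
    """Merge the records each backend system holds for one component
    into a single view. `sources` maps system name -> lookup dict."""
    summary = {"component_id": component_id}
    for system_name, lookup in sources.items():
        record = lookup.get(component_id, {})
        for field, value in record.items():
            # Prefix each field with its source system for traceability.
            summary[f"{system_name}.{field}"] = value
    return summary
```

In a real portal the lookups would be live queries against the ERP, inventory and timekeeping systems, with the user's portal login deciding which sources are queried at all.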

The security level is tied to the user’s login to the Enterprise Portal, determining how much they see and what rights they then have within each product. Effectively you now have one secure system that accesses all of the information relevant to that particular employee’s function.

Further, if an employee enters the number of a certain product component, all information on this component is displayed immediately on the portal page, including:

  • How this product is selling

  • What revenue the company achieves with this component

  • Whether there have been any complaints

  • An image of the component

  • How much time has been estimated for producing this product

  • How much time is actually needed to manufacture this product

The data for this comprehensive information page is compiled from different systems, and provides the validated employee with the right information just when and where they need it.

It’s a holistic view which allows employees to serve their customers and managers quickly with relevant information. When a customer calls to enquire about an order’s ETA, customer service staff can access the relevant information quickly and accurately, as it is linked to the inventory and manufacturing systems.

Contact us to learn more about how the Advance team can assist with your technology needs.

5 Challenges Faced On Small Data Reporting

samuel-zeller-4138.jpg

Big Data is often touted as imperative to businesses, but in recent years have we perhaps been so blinded by Big Data that we are ignoring its poorer cousin, Small Data?

Big Data, simply put, looks at trends, information and patterns that can be utilised to forecast, as well as give an overview of how your business is tracking. Big Data takes high volumes of different sets of data and displays this information in a way that lets management make decisions quickly and efficiently. Big Data is generally generated outside of the business to assist it in making decisions moving forward.

Small Data, on the other hand, allows the business to extract transactional information from data sources that end users can make use of immediately. Its focus is on providing information to the end user so they can take action right now. It allows users to determine why things happen, analyse this in real time and then take corrective action. Small Data can be generated as a subset of Big Data or from other non-traditional data sources. The main thing to remember here is that it helps the end user achieve a result.

Big Data and Small Data each have their place in a business aiming to improve its decision-making ability and resolve problems.

Formulating a plan to extract Small Data that suits each need within the company is paramount. If you ignore Small Data in favour of Big Data, you are robbing yourself of analytical tools that can help your company develop and improve.

The challenges facing managers developing tools for Small Data reporting are:

  • what type of data is required?
  • where will it be obtained?
  • who requires it?
  • what format is it required in?
  • how will you extract the data?

The best methodology is to look at the problem you have and work backwards from that point.

As an example, let’s look at the problem statement “Average days debtors take to pay have increased”. Looking at our challenge, we can see that we want to interrogate each customer and determine the payment days for each invoice against which a payment has been made (What). We check with accounts and find that this data can be retrieved from their SAP Accounts database (Where). It has been determined that accounts staff and sales account managers will use the data (Who): accounts to chase up overdue accounts, and sales to check credit terms prior to selling. The decision then needs to be made as to what format they want to see the data in (What format). An example may be a program that can run real-time analysis of the accounting data and display it to screen. Selecting the right tool to extract and display this information is paramount to ensuring that the tool gets used (How). There are many good Business Intelligence tools that will allow quick extraction, analysis and display of the results the user requires.
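The debtor-days extract above can be sketched in a few lines. The invoice records, field names and dates here are made up for illustration; in practice this data would come straight out of the accounting database:

```python
# Illustrative Small Data extract: average days each customer takes to
# pay, computed per customer from invoice records. All figures are
# invented sample data, not real accounts.
from datetime import date

invoices = [
    {"customer": "Acme", "issued": date(2017, 9, 1), "paid": date(2017, 10, 16)},
    {"customer": "Acme", "issued": date(2017, 10, 1), "paid": date(2017, 11, 20)},
    {"customer": "Beta", "issued": date(2017, 9, 5), "paid": date(2017, 9, 30)},
]

def average_payment_days(invoices):
    """Average (paid - issued) days, grouped by customer."""
    totals = {}  # customer -> [total_days, invoice_count]
    for inv in invoices:
        days = (inv["paid"] - inv["issued"]).days
        entry = totals.setdefault(inv["customer"], [0, 0])
        entry[0] += days
        entry[1] += 1
    return {cust: total / count for cust, (total, count) in totals.items()}
```

A report like this answers the “What” directly: any customer whose average creeps upward is a candidate for accounts to chase or for sales to review credit terms.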

As they say, “look after the pennies and the pounds will look after themselves”. In other words, Small Data can and will affect Big Data if looked after properly.


5 Public Cloud Myths Exposed

christian-lohner-194533.jpg

 

The public cloud is a hot topic in IT today. Even though it has been around for about ten years, cloud offerings from AWS, Azure and Google Cloud have made the public cloud more mainstream and easier to get onto. In some instances, though, companies are jumping on board without really understanding it. So, in an effort to separate fact from fiction, here are five myths to consider if you are contemplating moving to the public cloud:

1. Public Cloud is Cheaper

The AWS/Azure public cloud “pay by use” methodology was a huge game changer for companies jumping onto the public cloud, but there is an assumption that “pay by use” will automatically make the subscription cheaper.

It can in some instances, but it should be noted that in many cases High Availability environments will usually come out cheaper with a hosting provider rather than a public cloud option. Data out transfer costs and dedicated resource costs both come into play in a big way in a High Availability environment, and things can get very expensive, very quickly. Many companies have tried out the public cloud and have gone back to dedicated resources in a managed cloud where the investment is more reasonable and consistent.   

2. Everything should go to the Public Cloud

Due to the time it can take to tailor your application to the public cloud (not all applications are really built for the cloud or virtualisation, much less the public cloud), not all companies’ environments belong in the public cloud. You really need to have an in-depth discussion with your IT Provider to determine what can be in the public cloud and what should be in the public cloud.

3. Full Security/Compliance Comes with Cloud Infrastructure

Security in the cloud is much better today than it has been in years past. Even though public cloud offerings like AWS and Azure offer HIPAA- or PCI-compliant solutions, that does not mean you will automatically be compliant on moving to the public cloud. The infrastructure they provide to you is compliant, but once you configure your application on top of it, it becomes a completely different story.

4. Moving to Public Cloud is Simple

Some applications can be moved to the cloud simply; however, migrating a full environment that has not been configured for the cloud, and is technically complex in its own right, is a different story. Use your IT Provider or someone with the right expertise and experience to migrate the environment, as it can get complicated quickly, and without a good foundation, getting your application to work on top of it may end up being expensive.

5. Managing the Public Cloud is Simple

Once someone has designed, built and migrated your application to the public cloud, it should be simple to manage from there, surely? You would think so, but it is not the case! You really need to have your IT Provider work on maintaining, tweaking and scaling the configurations to keep your cloud “humming” along.

The simple suggestion here is to let the experts build, migrate and manage it for you. Cutting corners in the public cloud will come back to bite you.

For more information on Cloud & IT Services click here

Implementing a Business Intelligence Solution

stefan-stefancik-257625-1.jpg

Considerations when Implementing a Business Intelligence solution

As the type and number of Business Intelligence solutions have grown, considerations relating to the implementation need to be taken into account.

To help you avoid potentially costly mistakes, consider the eight common mistakes organisations make when they purchase a Business Intelligence solution, so you can make the most of your Business Intelligence software investment.

Not properly defining the business problem the BI solution will solve.

Companies jump in at the deep end and purchase software before there is a definable problem for it to resolve. In some cases organisations will purchase simply because they have found that someone else is using it, so they feel they need to as well.

Instead, step back and look at your problem statement, work out what is needed to resolve it, and then look for a BI solution that will give you that result.

Not getting acceptance from the end users of the system

Unless users see a benefit in the BI solution offered and actually use it, the solution is going to fail. Bring in all stakeholders at the beginning of the process to ensure not only that the problem exists, but that everyone agrees the solution offered will solve it.

Not factoring in security

Make sure that the BI solution you plan on purchasing has the ability to ensure that sensitive data remains secure and is not available for just anyone to look at.

Don’t be sold by the “sizzle”, you are buying the “sausage”

A common sales term is “sell the sizzle, not the sausage”. Go past the bells and whistles and concentrate on the features that matter at the foundation of a BI solution: data collection and integration across disparate data sources. Missing the nuts and bolts of what a BI solution should offer could leave you out in the cold when it comes time to pull all that data together for presentation.

Not choosing a scalable solution

You want to be sure when looking at a BI solution that it will adapt and grow with your business. If it is already slow to query your data how will this affect your business two or five years down the road? The same could be said for the size of your data. As your data grows your BI solution will need to keep up with it. If you buy small you could find yourself looking for a better solution sooner than you think!

Not factoring in the mobile workforce.

In some cases a simple KPI displayed on a smartphone is as useful as any report that could be printed to screen or paper. Being able to put information that is easily digested at your employee’s fingertips is becoming more important as the workplace becomes more diverse geographically.

Rushing implementation

If you take one thing away from this blog, take this one! Rushing an implementation is sure to cause problems down the track. Have a clear idea within your organisation about how long each phase is going to take and discuss that with the BI solution partner to ensure that everyone has the same expectations. Double-check throughout the buying process that any changes have not altered this expectation, as typically there is some scope creep as each party adds, offers or wants more functionality.

Breaking down the project and prioritising specific outcomes you want to achieve and communicating them is extremely important. Also consider stages within the project and whether milestones will be based on reports being delivered in set time frames.

Insufficient training and consulting

Budget in plenty of training and familiarisation sessions for users of the system. Also factor in how much consulting and programming will be required to complete each project or facet of a project. Look for online training videos to help users get the most out of the training and, in some cases, replace initial training. Knowing what you are “up for” early in the buying process avoids disappointment later when the true cost is found out. Remember, too, that although best guesses can be used to judge how long a project or sub-project may take, these can blow out based on inadequate information or improper definitions of the business process.

A properly implemented BI solution can identify opportunities, highlight risks and forecast trends. Foresight across the whole process, from identifying the problems the organisation has, through selecting the BI solution, to implementing it, is paramount to achieving that result.

For more information on Business Intelligence or other IT Solutions contact us

 

Preventing Data Leakage

kevin-364843.jpg

Are you at risk of leaking data?

We see the headlines on a regular basis: ‘…details of any Australian for sale on darknet’, ‘Personal details of world leaders accidentally revealed…’ These regular occurrences highlight a major problem facing today’s businesses, a problem which only continues to grow.

It is hard to measure the cost to a business once an incident has occurred. The damage can go well beyond monetary values; often the biggest damage to a business is reputational, with customer data making up 73% of leaked information (based on publicly disclosed breaches).

ludovic-toinel-349299.jpg

An IBM survey suggests that the average estimated cost for a business to recover from such an event is around $2.6 million.

However, the question shouldn’t be ‘how much would it cost to fix?’; the question is: how do we prevent data leakage?

First let’s get an understanding of the leading causes of data leakage and the types of data involved.

The threat of data leakage can be split into two categories: Internal threats and External threats. As you would have guessed, Internal threats are made up of employees, contractors, business partners and others with insider access. External threats are usually cyber criminals, hacktivists or competitor-sponsored attacks. It is also worth recognising that there is some middle ground, where someone inside the company assists an external threat.

Although we have listed insiders as Internal threats, it is important to note that 96% of insider data leakages are caused by inadvertent actions, often relating to malware, stolen devices or failure to follow internal IT policies.

What’s the Solution?

The good news is that there are many technical solutions and products designed to mitigate these risks, both inside and outside the organisation.

It is imperative to build a sound strategy around data leakage; below are the key requirements for the most important aspect of Data Loss Prevention (DLP):

  1. Identify / Prioritise data – Not all data is equal

  2. Categorise data – Apply persistent classification tags to the data that allows tracking throughout the organisation

  3. Monitor data movement – Identify what processes put data at risk

  4. Communication and Policy – Develop policies surrounding DLP and acceptable use of company resources

  5. Employee Education – Employees often don’t realise that their actions can result in data leakage. A strong employee educational focus in conjunction with policies and procedures can reduce the insider data leakage risks in an organisation by up to 80%
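Steps 1 to 3 above can be pictured as a toy content classifier: identify sensitive patterns, tag the content, and flag risky outbound movement. Real DLP products are far more sophisticated; the two patterns here are deliberately simple illustrations, not a real rule set:

```python
# Toy illustration of DLP steps 1-3: identify and tag sensitive data,
# then use the tags to decide whether content may leave the organisation.
import re

# Step 1/2: patterns that identify data worth tagging (illustrative only).
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text):
    """Return the classification tags that apply to a piece of content."""
    return {tag for tag, pattern in PATTERNS.items() if pattern.search(text)}

def allow_outbound(text):
    """Step 3: block any outbound content carrying a sensitive tag."""
    return not classify(text)
```

Persistent tags like these are what let a DLP tool follow a document around the organisation and report on where sensitive data actually moves.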

For more information on Data Leakage or other IT Solutions contact us
