Discovered by a Reddit user, as these things often are, it has emerged that the Epic Games Launcher scans for your Steam installation on each start-up and then grabs a snapshot of user files in the Steam Cloud, including data on game saves, play history, Steam friends lists, name history, and the groups you’re part of.
Under the GDPR, you can request that Epic remove all of your personal data, and failure to comply could carry legal ramifications.
Steam Cloud data is stored locally in Steam\userdata\[account ID]. Epic reads this folder, pulls the data, and then creates an encrypted copy which is placed at C:\ProgramData\Epic\SocialBackup\RANDOM HEX CODE_STEAM ACCOUNT ID.bak
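As a quick way to check whether such backup files exist on your own machine, a short script can simply look in that folder. This is an illustrative sketch, not an official tool; the directory path follows the report above, and if the launcher never wrote a backup the folder won’t exist and the search returns nothing.

```python
from pathlib import Path

def find_epic_social_backups(base="C:/ProgramData/Epic/SocialBackup"):
    """Return any .bak files found in Epic's reported backup folder.

    If the directory does not exist, Path.glob() simply yields nothing,
    so this safely returns an empty list on machines without the backup.
    """
    return sorted(Path(base).glob("*.bak"))

backups = find_epic_social_backups()
print(f"Found {len(backups)} backup file(s)")
```

On a machine where Epic has never run, this prints that zero files were found.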
The purpose of this appears to be to provide friend suggestions in the Epic Launcher, effectively linking the two systems up. According to Epic, this is done with the user’s express permission: consent is tucked away in the lengthy agreement you accept when installing the Epic Launcher and signing up for an account.
A new report from Microsoft found that phishing attacks increased 250% over the course of 2018. According to Microsoft’s Security Intelligence Report (SIR) volume 24, attackers have shifted tactics and are now targeting multiple points of attacks within one campaign.
“Hacking is a multi-billion-dollar industry. If it was being run by one company rather than a mix of organized crime syndicates, lone wolves and governments, it would be comparable to a major NASDAQ tech business,” said Colin Bastable, CEO of cybersecurity test and training company Lucy Security.
Malicious actors continue to find success with new tactics, such as rapidly cycling through different URLs, domains and servers when distributing emails and hosting phishing forms.
Security researchers have concluded that a Chinese-made baby monitor sold on Amazon is riddled with vulnerabilities, confirming a mother’s suspicion that her device had been hacked to spy on her infant.
SEC Consult said the FREDI-branded device, which is designed to look like a puppy, is most likely the work of an OEM called Shenzhen Gwelltimes Technology Co., Ltd.
The device has a P2P cloud feature that lets supported smartphone and desktop apps connect to it via the cloud, making it easy for users to interact with it without needing to be on the same network; no firewall rules, port forwarding rules or DDNS setup are required, SEC Consult claimed.
Spurred by business needs to protect, prevent, remain resilient and agile, and survive, Business Continuity capabilities allow the mission-critical functions of a business to continue operating during and after a disruptive event.
In today’s integrated, digitally networked economy, where business operations depend heavily on IT and digital infrastructure, even a few hours of service disruption can have a devastating impact on most businesses.
Business Continuity planning and Disaster Recovery capabilities are therefore more relevant now than ever before, as businesses strive to become more resilient and agile in an increasingly uncertain world. In fact, governance and regulatory requirements, compliance with global best practice standards, supply chain concerns and the potential impact on revenue, brand and reputation have all helped move business continuity higher on the boardroom agenda.
However, businesses are also beginning to realize that their existing plans and procedures may no longer be adequate in light of these new threats.
A business continuity plan is crucial to ensure the business keeps running after a disaster.
It is a well-known industry reality that for every five businesses that experience a disaster, two will go out of business within five years of the disaster date. This makes for stark reading and underlines the importance of ensuring businesses have a robust, organised and manageable business continuity plan in place for such disasters.
In today’s business environment, disasters can come in various guises, ranging from acts of terrorism to natural disasters, and with the competitive landscape becoming more ruthless, business continuity plans and strategies take on great significance, as they are the plans that will ensure the future trading and success of the business.
When a business decides to move into the cloud, it is placing its business data, applications and software in the hands of a cloud computing provider. It is therefore important to ensure that this provider has appropriate business continuity and disaster recovery plans in place, because the company is entrusting a significant portion of its business resources to that environment. It is crucial to understand what will happen if the cloud environment becomes unavailable due to a disaster, whether natural or deliberate.
A business therefore needs to choose a cloud computing company that is robust and trustworthy, and there are many reasons why this matters. Chief among them: cloud users often have little or no knowledge of the technology or security being employed, so they are putting their trust entirely in the cloud provider’s IT systems and infrastructure. Cloud users also have no control over the management of that infrastructure or which data centers are used, and many cloud computing companies rent their data centers and infrastructure from other cloud providers. This means that if a disaster occurs, the availability of all resources will be affected, and it can be hard to get questions answered regarding uptime and the like.
A Disaster Recovery Plan should include:
Names, titles and functions of every team member who has a role in the disaster recovery process;
A core team, with a head who has the authority to take charge in the event of a disaster;
Complete contact details of team members and alternate contacts;
Contact details of vendors, support agencies, consultants etc. with whom disaster-handling agreements have been entered into;
A clear definition of what constitutes a disaster situation;
Coverage of minor as well as major disasters;
Employee training for the skills needed in the salvage and resumption phases during and after the event;
A list of functions that are critical for business continuity;
The priority of those functions;
Copies of contracts/agreements with disaster-support vendors such as salvage/reconstruction consultancy firms, alternative office sites, equipment vendors and so on;
Contact details of the fire department, police, ambulance etc.;
A floor plan/blueprint of the building in which the organization operates.
By incorporating the above items, and anything more that might be relevant, a complete and effective disaster recovery plan can be drawn up.
With the right training for all concerned parties and periodic mock tests, it is possible to recover successfully from a real-life disaster.
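To keep such a checklist auditable, some teams encode the plan as structured data so completeness checks can be automated. The sketch below is purely illustrative; the field names and sample entries are assumptions, not drawn from any standard.

```python
# Illustrative encoding of a disaster recovery plan as plain data, so that
# checks like "every critical function has a priority" can be automated.
# All field names and sample values are hypothetical.
DR_PLAN = {
    "core_team": [
        {"name": "J. Smith", "role": "DR lead",
         "contact": "+1-555-0100", "alternate_contact": "+1-555-0101"},
    ],
    "disaster_definition": "Any event causing >4h outage of a critical function",
    "critical_functions": [
        {"function": "order processing", "priority": 1},
        {"function": "customer support", "priority": 2},
    ],
    "vendors": [{"name": "SalvageCo", "agreement": "contract-042.pdf"}],
    "emergency_contacts": {"fire": "911", "police": "911"},
}

def missing_priorities(plan):
    """Return the names of critical functions lacking an assigned priority."""
    return [f["function"] for f in plan["critical_functions"]
            if "priority" not in f]

print(missing_priorities(DR_PLAN))  # an empty list means all are prioritized
```

A periodic script like this can flag gaps before a mock test, rather than during a real disaster.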
Twelve years ago, Amazon launched Prime, a subscription service that entitled members to free two-day shipping in the United States.
Since then, it has added a number of options to make delivery faster and more convenient. Prime customers can get same-day delivery, and even delivery within an hour or two on some items. Of course, customers aren’t always home to receive their packages, so Amazon started putting lockers in nearby convenience stores and building lobbies. It has even shown off drones that could drop a package right into your backyard. Today it’s taking the obvious next step, introducing a service that allows Amazon couriers to open your front door and put your package safely inside your home.
The service is called Amazon Key, and it relies on Amazon’s new Cloud Cam and a compatible smart lock. The camera is the hub, connected to the internet via your home Wi-Fi. The camera talks to the lock over Zigbee, a wireless protocol used by many smart home devices.
When a courier arrives with a package for in-home delivery, they scan the barcode, sending a request to Amazon’s cloud. If everything checks out, the cloud grants permission by sending a message back to the camera, which starts recording. The courier then gets a prompt on their app, swipes the screen, and voilà, your door unlocks. They drop off the package, relock the door with another swipe, and are on their way. The customer will get a notification that their delivery has arrived, along with a short video showing the drop-off to confirm everything was done properly.
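The delivery flow described above is, in effect, a cloud-mediated authorization handshake. The toy sketch below models that logic; the names and data structures are hypothetical illustrations, since Amazon’s real protocol is not public.

```python
# Toy model of the Amazon Key flow: the scanned barcode must match a
# scheduled in-home delivery for this specific door before the cloud
# starts the camera recording and unlocks the door. Names are hypothetical.
AUTHORIZED_DELIVERIES = {"PKG-123": "front-door-lock-7"}  # barcode -> lock id

def request_unlock(barcode, lock_id):
    """Cloud-side permission check for a courier's unlock request."""
    if AUTHORIZED_DELIVERIES.get(barcode) != lock_id:
        return {"granted": False}
    # On success: camera records, then the courier's app may unlock the door.
    return {"granted": True, "camera": "recording", "door": "unlocked"}

print(request_unlock("PKG-123", "front-door-lock-7"))  # granted
print(request_unlock("PKG-999", "front-door-lock-7"))  # refused
```

The key design point the article describes is that the courier’s app never holds a door key itself; each unlock is granted per delivery by the cloud.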
The value of the Cloud Supply Chain Management (SCM) market is projected to reach $11bn by 2023, according to new figures.
Surging adoption in transportation management has been one of the major drivers for the cloud SCM market, research published by P&S Market Research found.
As the world’s transportation networks and supply chains become increasingly intertwined and complex, the systems that support them are advancing and improving at a rapid pace.
Software vendors have been integrating more transportation optimization capabilities into their solutions, making it easier for shippers to streamline their supply chains, while also making them more cost- and time-efficient. This has been augmenting the growth of the cloud SCM market.
During the course of the analysis, P&S found that demand planning and forecasting is projected to witness the highest growth, with 20.3% CAGR during the forecast period, among all solutions in the cloud SCM market.
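For context, a CAGR figure compounds annually, so a segment growing at 20.3% more than doubles over five years. The quick calculation below shows this; the starting value and period are illustrative, not figures from the report.

```python
def grow(value, cagr, years):
    """Compound a starting value at a constant annual growth rate (CAGR)."""
    return value * (1 + cagr) ** years

# Illustrative only: a $1bn segment growing at 20.3% CAGR for 5 years
print(round(grow(1.0, 0.203, 5), 2))  # roughly 2.52, i.e. more than doubled
```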
Demand management solutions help to predict and manage replenishment effectively, align price and profit margins, better leverage past product performance and maintain a leaner and more profitable supply chain.
In a statement, P&S said: “Demand management solutions take supply chain management to the next level by enabling an automated ecosystem that simultaneously maps demand forecasting against factors like financial predictions, supply restrictions, inventory counts and customer commitments, as well as patterns of behaviour that can affect demand at any given time.”
Businesses everywhere, beware—what happened at Verizon can happen to you, too.
The names, addresses, phone numbers and in some cases, security PINs of 6 million Verizon customers stored on large cloud-computing servers were made available to the public, the telecommunications carrier said this week after a cybersecurity company notified it of the exposed data.
Verizon chalked the leak up to human error, saying it was because an employee of NICE Systems, one of its contractors that it uses to analyze its customer service response, made a mistake. No customer information was stolen, Verizon said, and it apologized to its customers.
The leak comes a month after the discovery that the names, birthdays, addresses and other personal details of 200 million registered voters were exposed by a contractor for the Republican National Committee. In a similar scenario, the RNC contractor had failed to ensure that the voter files stored on an Amazon cloud account were not available to public access.
More such exposures are likely until businesses, which are increasingly using the cloud to store and analyze customer data and their own content — for instance, images that populate their websites — get a firm grip on the security protections they need to place around such data.
“When you have these complex systems and you force humans to solve the problem manually, we make mistakes,” said Nathaniel Gleicher, former director of cybersecurity policy in the Obama administration. “Complexity is the enemy of security.” His take: data leaks are going to keep happening until cloud storage systems become more automated and enterprises have more help dealing with these systems.
Amazon Web Services, where the Verizon data was stored, operates under a “shared responsibility” model with the customer — the Amazon cloud unit controls the physical security and operating system, and gives customers encryption tools, best practices, and other advice to help them maintain security of their data. The customers are responsible for making sure their applications are secure.
It’s roughly similar to a Google Docs user setting the “sharing” setting to private, a small group, or anyone.
After uploading files into an Amazon Web Services server, a business makes adjustments to who can access the files in a certain “bucket”, and the permissions (say to edit or just view). By default, the data is set to private so that only the person uploading the files can see them. The user can widen access to various groups, including authenticated users, that is, anyone with an AWS account that has permission to access the files; and everyone.
“Use this group to grant anonymous access,” says the AWS website. The NICE Systems employee might have clicked the “everyone” category while meaning to give access to another group.
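The access tiers described above can be modeled simply. The sketch below is a toy approximation of S3-style canned permissions (private by default, readable by authenticated AWS users, or readable by everyone), not AWS’s actual access-evaluation engine.

```python
# Toy model of the three access tiers described above. This is an
# approximation for illustration, not AWS's real ACL logic.
def can_read(acl, requester):
    """requester is 'owner', 'authenticated' (any logged-in AWS user),
    or 'anonymous' (anyone on the internet)."""
    if requester == "owner":
        return True                      # the uploader can always read
    if acl == "authenticated-read":
        return requester == "authenticated"
    if acl == "public-read":
        return True                      # everyone, including anonymous
    return False                         # default tier: private

print(can_read("private", "anonymous"))      # the safe default
print(can_read("public-read", "anonymous"))  # the Verizon-style exposure
```

In these terms, the mistake described above amounts to one wrong `acl` value: setting “everyone” where “authenticated-read” (or private) was intended.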
SolarWinds recently released its 2017 IT trends report, examining the effects of increasing cloud adoption on IT and the shift to a hybrid infrastructure.
The SolarWinds IT Trends Report 2017: Portrait of a Hybrid IT Organization, released Wednesday, highlights IT’s continual move to the cloud, especially for key workloads. According to the report, 95% of IT professionals surveyed said they had “migrated critical applications and IT infrastructure to the cloud over the past year.”
However, despite the growth of cloud deployments for key applications, IT budgets aren’t necessarily shifting to reflect this change. In the report, 69% of respondents said that less than 40% of their yearly IT budget is spent on cloud technologies. But 59% did note that they were receiving the expected benefits from a cloud implementation (e.g. scalability, availability).
The report also said that many organizations (45%) are still dedicating some 70% or more of their yearly budget to traditional, on-premises applications. This indicates continued strong demand for hybrid infrastructure, with businesses using both public cloud platforms and local data centers.
Over the past 12 months, 74% of respondents reported that their organization had moved applications to the cloud, while 50% moved storage, and 35% moved databases. These areas were initially prioritized for migration, as they were seen to have the greatest potential ROI, the report said.
Despite the upswing in cloud adoption, 35% of those surveyed said they had initially migrated applications and infrastructure to the cloud before ultimately bringing them back on-premises. The two areas most commonly brought back on-premises were applications (19%) and databases (13%). In terms of the reasoning behind this shift, 28% cited security and compliance, and 21% said poor performance was to blame, the report said.
In addition to changing revenue and delivery models, the cloud has also changed the careers of the IT practitioners who are using it. Of those surveyed, 62% said that the cloud has required them to learn new skills, but has also allowed them to stay on the same career path, while 11% said the cloud has altered their career path. More than half of the surveyed professionals also said that their organization was actively looking, or at least planning, to hire or reassign IT professionals strictly for the management of cloud technologies.
However, an IT skills gap and increased workload were listed as two of the largest drawbacks to cloud deployments. “Nearly half (46%) do not believe that IT professionals entering the workforce now possess the skills necessary to manage hybrid IT environments,” the report stated.