As the IT environment becomes more complex, it has become more important to secure Microsoft Windows Server. The most important technical infrastructures used to be housed on-premises. IT departments were responsible for monitoring them. However, they can now be located in the cloud, colocation facilities, or private hybrid clouds.
Windows Server 2016 and the changing face of cybersecurity
The evolution of Windows Server is a good guide to understanding how security requirements have changed throughout the years.
Active Directory was introduced in Windows Server 2000. It unites the identity management services and processes that are essential to modern multi-site Windows Server implementations, which makes securing it critical: misuse of highly privileged accounts or compromise of Active Directory domain controllers can pose serious risks to your organization’s reputation and data.

Windows Server 2008 included Hyper-V, a hypervisor that has helped accelerate virtualization in Windows environments. Hyper-V still offers the essential functions, including the creation and management of virtual machines, that make virtualization an attractive alternative to traditional physical hardware.

Windows Server 2016 strengthened Failover Clustering for Hyper-V and introduced a form of software-defined networking (SDN) similar to that of the Azure cloud. These features make it a better choice for organizations that have moved more operations to the cloud for increased scalability and flexibility. The core security mechanisms in Windows Server 2016 are designed to protect workloads and data no matter where they are located, whether in a server closet, a faraway data center, or anywhere else. The overall approach has been described as “proactive security”: anomalies can be detected early and addressed through a combination of measures such as log analytics integrations, privileged credential protections, and improvements to the virtualization fabric.
Log analytics integrations
Operations Management Suite has long been a valuable resource when working with Windows Server. With Server 2016, its log analytics capabilities are even more powerful, because they can draw on security data from the platform’s more detailed logging.
Those details can be fed into an analytics engine along with data such as intrusion detection incidents to create a comprehensive security story covering all IT environments within an organization, so that security personnel are alerted to suspicious or unusual activity.
Protections for privileged credentials
Administrator accounts in every version of Windows Server carry extensive permissions. Those permissions are essential for troubleshooting and modifying any environment, but they can also open the door to cyberattacks. This presents several risks, including privilege misuse and escalation as well as pass-the-hash and pass-the-ticket attacks.
Privilege misuse was responsible for 14 percent of the data breaches reported in the 2017 Verizon Data Breach Investigations Report. Pass-the-hash is a threat that has been around since the 1990s; it involves impersonating users by stealing the password hashes from their accounts. Pass-the-ticket is a more recent variation on the same attack vector, the key difference being its use of service tickets to impersonate domain users. A fundamental flaw in many administrator accounts is the extent of the privileges they grant for an unlimited amount of time, which allows the accumulation of credentials that can be used in cyberattacks.
Server 2016 provides some important safeguards against such risks. Credential Guard is a virtualization-based security solution that protects credentials from being intercepted. Remote Credential Guard provides similar protections for remote desktop protocol (RDP). It allows secure single sign-on, so credentials are not passed on to the RDP host. This reduces the attack surface.
“There are provisions to allow for ‘just enough administration’ and ‘just in time administration.’”
There are provisions for “just enough administration” as well as the related “just in time administration.” These arrangements limit administrative privileges to the narrow set of actions required, and the resulting workflows are thoroughly audited.
Virtualization fabric improvements
Many new protections have been added to virtual machines (VMs) as part of Server 2016.
Shielded Virtual Machines use BitLocker to encrypt VMs, protecting them from malware and from compromised administrator accounts. These situations are becoming more common with hybrid clouds, which combine multiple environments spanning local facilities as well as remote data centers. Windows Server 2016 offers advanced, reliable defenses suited to these new approaches to IT.
To advance your IT career, learn more about Windows Server 2016.
Windows Server’s continued evolution has secured its place at the heart of modern IT. It will continue to deliver the security, scalability, and reliability that organizations have come to expect from server OSes.
Learn more about Server 2016 and the other important platforms for today’s IT professionals.
Internet-connected devices became cheaper and more versatile as a result, and their popularity quickly spread to the consumer market. Computers are no longer the huge technological marvels that only large research institutions and companies could house and maintain; they are part of our daily lives. Computing power has become so compact that people can now choose from a wide range of gadgets.
According to Pew Research Center, over two-thirds of Americans owned a smartphone in 2015, and 45 percent had a tablet at home. These levels of ownership have enabled an unprecedented age of information, but they have also led to an interesting trend in the workplace: bring your own device (BYOD).
This is a relatively new concept: traditionally, employers were expected to provide all the tools their workers needed to do their jobs. But innovation is essential in the business world, and BYOD could change the way the office looks. Let’s take a closer look at this trend, what companies can learn from it, and what obstacles they face.
“A BYOD policy” means that companies won’t need to invest as much money in new hardware.
BYOD can be very beneficial
The most obvious benefit is that BYOD policies save companies money on hardware. Technology is constantly evolving, and while it’s important to use the best tools available, it’s impractical for a company to keep refreshing the gadgets that employees use on a daily basis.
The average consumer, on the other hand, buys new devices frequently; many people will buy a new phone or computer every year. This can be very beneficial for the company: employers enjoy the convenience and speed of modern gadgets without actually having to pay for them.
BYOD has also been shown to increase productivity. According to Cisco’s Internet Business Solutions Group, the average American BYOD user saved approximately 81 minutes per week by using their own device at work. This equates to an extra 70 hours of productivity each year.
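As a quick sanity check on those numbers (assuming the 81 minutes are saved every week of a 52-week year, which the article doesn’t state explicitly), the arithmetic holds up:

```python
# Rough check of the Cisco BYOD productivity figure.
# Assumption: the 81 minutes are saved in every week of a 52-week year.
minutes_saved_per_week = 81
weeks_per_year = 52

hours_saved_per_year = minutes_saved_per_week * weeks_per_year / 60
print(round(hours_saved_per_year, 1))  # 70.2, matching the quoted "extra 70 hours"
```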
Employers can get almost two additional weeks of work from their employees by allowing them to use their phones and computers for work purposes. There are many reasons for this, but one reason is that employees are already familiar with their devices and don’t need to spend time learning how to use them. Staff members who have a BYOD policy may bring their work home, since company data will be on the device.
There are also downsides
There are some negative aspects to BYOD, as with any emerging trend. The security of company information is the biggest concern. While workers may be able to bring this data home, it also presents significant security concerns. The first problem is the current mindset surrounding smartphone security.
Companies that aren’t prepared for BYOD face real security risks. Many people don’t see their smartphones as full-fledged computers, so they don’t take the same precautions they would with a desktop or laptop. According to Consumer Reports, 34 percent of respondents didn’t use any security measures to protect their smartphone’s information, not even a 4-digit PIN. That means a large portion of the population could lose control of their smartphone’s data simply by leaving the device on a train or bus.
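To put that in perspective, even the 4-digit PIN those respondents skipped is a thin barrier, because its entire keyspace is tiny. A back-of-the-envelope calculation (the one-guess-per-second rate is an illustrative assumption, not a measured figure):

```python
# Size of the 4-digit PIN keyspace and a worst-case brute-force estimate.
digits = 10            # 0-9
pin_length = 4
keyspace = digits ** pin_length
print(keyspace)        # 10000 possible PINs

# Illustrative assumption: one guess per second with no lockout policy.
hours_to_exhaust = keyspace / 3600
print(round(hours_to_exhaust, 1))  # about 2.8 hours to try every PIN
```

Lockout policies and wipe-after-N-failures settings exist precisely because the raw keyspace is this small.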
This is a frightening fact because it means that company information can be stolen at a moment’s notice. While malicious strangers are a real threat, organizations must also be concerned about who is using the employee-owned device. Computer Weekly’s William Long pointed out the many problems facing institutions in this area.
Long stated that these issues include “ensuring that work data is not merged with an employee’s personal data [and] that non-employees such as family members who use a device do not have access to work data.”
While most employees won’t mind letting a friend use their phone, it could cause a serious breach of company data. Employers cannot leave such decisions up to their employees.
Employees are using their own devices anyway
While there are obvious disadvantages to BYOD programs in the workplace, they can be mitigated if the company does the right things. The problem is that many companies don’t realize that employees will be using their own devices, regardless of company policy.
Gartner’s study of 4,300 Americans revealed some alarming results that should make administrators rethink their security measures for their company’s data. The survey revealed that 37% of those who use their gadgets for work purposes do not have the express permission of their bosses.
“Employees use the gadgets they own because it’s convenient, and they often don’t know about the security risks.”
This is a large portion of the workforce carrying information on their smartphones and laptops that they shouldn’t be taking with them. That realization alone should be enough to convince you of how important it is for your company to have a BYOD policy. This isn’t because employees are malicious or don’t care what happens to the company: they use their own gadgets because they are easy to use, and they often don’t know the security risk they pose to their employer.
What can offices do for their employees?
You probably cleaned out unused programs and cleared the IE cache before upgrading to Windows 10. What about after you upgrade? It’s pretty much the same list! (Except that SSDs do not require user-commanded defrags: the OS handles that automatically if the drive reports as an SSD, and SSDs are expensive, so you don’t want to wear them out.)
After upgrading, the first thing you should do is check your anti-malware program: it may need to be updated or replaced for Windows 10. (At the least, check for updates, possibly several times.)
Then consider System Restore. Your old Windows 7 and 8 restore points will no longer be usable, and deleting them could reclaim another 10 to 30 GB. That is a one-way street, but I like Windows 10 enough to continue down that road. Note a new quirk, however: in Windows 10, System Restore is disabled by default! Click [Start], enter [System Restore], hit [Enter], click your C: drive (or wherever you put Windows), and click [Configure]. You can then enable System Restore, give it drive space, and delete the old Windows restore points. You can also save backups and System Images to an external drive; unplug that drive before you delete restore points. This is a must, especially during the initial break-in period.
The upgrade process creates a folder called Windows.old that is 10-30 GB in size. You can’t simply delete it without causing errors, but you can remove it through Disk Cleanup by clicking [Clean Up System Files] and then the [Previous Windows Installation(s)] selection. If you do this, know that you are giving up the 30-day rollback option. It’s unlikely you’ll ever need anything more from Windows.old, and there’s a lot of SSD space to reclaim, but you might want to wait a few weeks before making this decision.
You’ve reclaimed space and done your drive maintenance (all the cleanups through backup), and Windows can’t find any more to free. What’s next? The UI, or user interface.
Drag open the Start menu to see all the tiles that Windows 8 introduced, and look through the properties of the Start menu and Task Bar. The new browser also takes some getting used to:
o There are no Internet Options to clear the cache, reset security, or manage add-ins.
o No File Save, no menus, none of the old menu options. Okay, that’s big.
o You can’t drag URLs onto a folder to create a shortcut.
o URLs cannot be dragged onto the Favorites Bar. The Favorites button is easy enough to use, but it’s still a step back in usability.
o Print Preview has no custom zoom, no custom margins, no view sizing.
o On the other hand, the amazing new Reading View gives you web pages without ads. It’s so good it’s hard to give up.
Check your data folders and try all your file types: pdf, jpg, bmp, mp3, wav. Check out the new programs to see whether you like them or whether they lack features. You might go back to Acrobat Reader or Photo Viewer (I love the MS Office 2010 Picture Editor the most), Media Player, real Word instead of Word Viewer, or any other program. Right-click a file and open its properties; click [Change] if you don’t like the program assignment. Or perhaps you want to test the new stuff first. (Maybe not.)
The Start menu will show you all your apps. You may need to upgrade some apps for Windows 10 or uninstall others. Consider running a VM (virtual machine) if you have a critical app or device that you cannot replace; this requires an i3 or better CPU and sufficient RAM to run two OS instances.
We mentioned Device Manager earlier: make sure your devices have the right drivers before you delete Windows.old. Some of your drivers may have been replaced. Some new drivers improve compatibility, speed, and function, while others (now universal drivers) lose device features. To get those devices working again, you will need to roll back some drivers from Device Manager: open the device’s properties and use the Driver tab.
Take a look in Control Panel for any new items, and check [Security and Maintenance] for items that require attention. You can find Windows 8 features such as [Storage Spaces] and [Work Folders] even if you upgraded straight from Windows 7. These utilities alone might be enough to convince you to upgrade from Windows 7!
Once you’ve completed all customizations to apps and UI, it’s time for a System Restore point, full Backup, and [Create System Image] (see Windows Backup under Control Panel). You can save a copy to an external or network drive.
Now it’s smooth cruisin’. That didn’t take too long, did it?
You can also call your Mom or Dad and walk them through the process. It’s called “Bonding Time”. They will love you for it.
Originally written by Bill Sullivan, NHSoCal Technical Instructor; MTA x3, MCSA x6, MCITP x4, MCTS x5, MCSE x2, MCT, CCNA, CISSP, CASP, COWA, CompTIA x5
Join us at our Infrastructure Modernization Clinic! Register today and get more information by clicking here
An Infrastructure Modernization Clinic is open to you! Are you looking to bring the cloud into your data center? Do you want to deliver applications faster? Do you need to migrate off Windows Server 2003 now that support has ended? Learn how to expand your data center using a hybrid cloud infrastructure.
Cybersecurity is a constant concern as more IT infrastructures are virtualized and migrate to public, private, and hybrid cloud environments. The act of taking assets off premises and entrusting them to third-party service providers carries at least some risk. Hardware and software that were once under your direct oversight are now someone else’s responsibility. Even though they are usually in capable and experienced hands, you must remain vigilant against possible cyberattacks on your critical servers, storage, and networks.
These concerns are pertinent when using platforms like Microsoft Windows Server. Windows Server has been an integral part of enterprise IT for over 20 years. Its features are key to modern identity management (via Active Directory), as well as the ongoing virtualization boom thanks in part to Hyper-V. What security risks should IT professionals working with Windows Server keep in mind?
Complex cybersecurity history of Windows Server
Although Windows Server security is a crucial task, it has not always been easy. Windows Server 2003 is an example. Server 2003 was widely used long after it was replaced by several new versions of the platform. This is similar to Windows XP which outlasted Windows Vista. The familiarity and the disruption costs associated with upgrading likely supported the decision to stick with an older OS, even if there were better alternatives.
This situation caused a lot of anxiety about the critical systems still running on Windows Server 2003: left unpatched, they were at risk from old malware and from zero-day attacks that would never be addressed. Windows Server 2003 usage has declined dramatically in the two years since its end of life. Spiceworks estimated the aging platform’s market share at 61 percent in 2015, but that number dropped to only 18 percent a year later (although less than half of its respondents reported running at least one instance of Server 2003).
Windows Server 2003’s vulnerabilities are mostly not a problem for a newer server OS with better built-in protections and timely patches. But that doesn’t mean there is nothing to be concerned about. Windows Server 2016 was, in fact, designed with many modern cyberattack vectors in mind.
Windows Server Security: The defenses built into Windows Server 2016
A Microsoft security whitepaper about Windows Server 2016 described an attack scenario that has become more common in recent years.
Attackers conduct preliminary research on their targets, including their social media channels, and then use spear-phishing to trick email recipients into clicking links to compromised websites. They often go undetected for long periods of time: a 2016 Accenture survey found that 59 percent of financial service providers needed months or longer to detect breaches. Windows Server has been a target of these sophisticated schemes; pass-the-hash, pass-the-token, and pass-the-ticket attacks all fall into this category. Knowing how such intrusions can be stopped, or at least detected earlier, is important for reducing breach costs, which can reach into the millions of dollars per incident according to the Ponemon Institute.
Privilege protections in Windows Server 2016
Although we cannot cover all of Windows Server 2016 security features, one group of functions deserves more attention: Its various administrative privilege protections. Many attacks that could have been contained spiral out of control because elevated privileges are easily accessible for long periods of time. Windows Server 2016 provides many advanced security features to prevent privilege escalation.
Just Enough Administration and Just in Time Administration limit the extent and duration of privileges. The idea is to allow legitimate administrators to perform crucial tasks using tools like PowerShell while limiting the potential for abuse by withholding permissions that aren’t needed for the job at hand. Just in Time Administration can be implemented with the Local Administrator Password Solution, which stores local administrator passwords in Active Directory and protects them with access control lists, so that only a limited number of users can read them or request a reset.
Windows Server 2016 also offers Credential Guard and Remote Credential Guard, both designed to protect credentials and credential derivatives from pass-the-hash and pass-the-ticket attacks. Advanced Threat Analytics is another mechanism for combating pass-the-hash and for detecting compromised identities that may be in use by attackers.
Additional security features you should know about
Windows Server 2016 comes with Windows Defender, which provides protection against malware, viruses, and other threats on both on-premises systems and cloud-based ones. Secure Boot ensures that only software trusted by the device manufacturer can start, which helps curb rootkits and other low-level attacks that often stem from unsigned programs.
Control Flow Guard is also new in Windows Server 2016. It is designed to contain memory-corruption exploits by tightly restricting where an application’s indirect calls can transfer execution.
Comparing Ethical Hackers to Penetration Testers
Both ethical hackers and penetration testers work in corporate settings, using their knowledge of computers and security to prevent security breaches. Penetration testers may be part of an IT team, overseeing all aspects of network security, while ethical hackers use their hacking skills to stop other hackers from attacking the system. Below we discuss some of the differences and similarities.
Responsibilities of Ethical Hackers and Penetration Testers
Ethical hackers and penetration testers share similar daily responsibilities: they expose weaknesses in a company’s network to head off hackers, viruses, and other security threats. Ethical hackers use their knowledge to expose security flaws in a network and fix them. The role of a penetration tester is broader, overseeing security in all its aspects, from mobile applications to source code. They may also work with employees to encourage safe routine computing practices.
Ethical hackers use their hacking skills to inform companies about vulnerable areas in their networks. This involves running hacking tests and creating worms, viruses, and other malware. It is essential to have a deep understanding of hacking, computer programming, and IT, and to be able to explain complex issues to the companies they work with. Experience is a must, but certifications such as the Certified Ethical Hacker certification from the International Council of Electronic Commerce Consultants can be an advantage.
The following are the job responsibilities for an ethical hacker:
Keep up to date on the latest hacking and IT trends
Develop solutions for problem areas within a network
Fine-tune network detection mechanisms
Prepare reports that highlight vulnerabilities and possible breach points
Certified penetration testers work to prevent security breaches at companies and other organizations. They use a variety of tools to identify the areas most vulnerable to attack, including code review, threat modeling, and attack-and-penetration testing. Communication with employees is important, as these testers must inform staff of the best safety practices for daily computer use. Penetration testers can work within a larger department, such as IT or cybersecurity, or as contractors. While a bachelor’s degree is helpful, it may not be as valuable as relevant experience.
The job responsibilities of a penetration tester include:
Develop methods to detect attacks
Create programs that test network security
Determine company management’s needs and concerns
Explain basic security practices to clients

Required Experience
Employers may accept a candidate with several years of experience as an alternative to a degree, and may require applicants to have experience with Microsoft and Linux operating systems. It is worth looking into opportunities to learn and work with a variety of programming languages and security software.
The ethical hacker program lasts five days and includes intense hands-on practice with the most recent hacking tools and techniques; it prepares individuals for the certification exam to earn the ethical hacker credential. The Certified Information Systems Security Professional (CISSP) certification covers training in a variety of topics, including security policy, cryptography, ethics, and more. These are the two certifications employers prefer, though other security and IT certifications may be useful for those who want to become penetration testers.
Computer skills are essential for penetration testers. They must be familiar with both computer hardware and network equipment. They must be able to communicate clearly in writing, as they will need to produce written reports based upon their results. They must be able to pay attention to details and solve problems to accurately evaluate the effectiveness of security systems. These skills are essential to identify security vulnerabilities and fix them. They also need to have excellent analytical skills in order to effectively review the relevant data.
Career Definition of a Penetration Tester
Penetration testers, also called ethical hackers or information security analysts, are responsible for protecting computer information systems from hackers. They run tests against networks, software, and applications, attempting to gain access to data without authorization in order to identify weaknesses in existing systems. They then collaborate with other professionals to find the best solution, which could include rewriting program code or adding security measures.
Penetration testers also review security incidents. They compile the results of their assessments and write reports about them.
Salary and Career Outlook
The U.S. Bureau of Labor Statistics (BLS) listing for information security analysts includes penetration testers. The BLS predicts that these occupations will grow by 31% between 2019 and 2029.
One customer contacted the RISK team with an issue involving a primary competitor. This threat actor, located on another continent, had recently unveiled a piece of large construction equipment that looked exactly like a model our victim had recently created. This was made more suspicious by the fact that the competitor had never previously produced this type of equipment and had no track record in the market. The victim was concerned that the equipment’s design details had been obtained illegally and that other projects could be similarly compromised.
Data breach investigations don’t always require the analysis of digital evidence. We find that in many cases traditional investigative methods are just as important, if not more so, than data obtained using the latest forensic tools.
Interviewing the chief design engineer was crucial in this case: it helped us determine how the design was taken. Interviews with key employees allowed us to focus on the system the chief engineer used for the stolen equipment model.
Response and investigation
We arrived at the victim’s headquarters shortly after the initial notification and began interviewing key stakeholders, starting with the team responsible for the equipment model at the center of the investigation. By comparing the features of the threat actor’s recently released model with their own, the victim’s design team identified several key details and parts that were identical, many of them unique in the industry. After concluding that the equipment model’s designs had most likely been compromised, we asked for the names and contact information of the employees involved in the design project.
The first employee we interviewed was the project’s chief engineer. The interview revealed that he was actively seeking employment elsewhere and might not be working for the victim much longer. A recruiter had contacted the engineer via LinkedIn, and the two had exchanged email addresses.
Digital forensic analysis of the chief engineer’s system and the associated firewall logs revealed evidence of a breach related to the design plans. The system contained a backdoor shell written in PHP, a scripting language, and there were clear indications that the threat actor had copied the file containing the design plans.
Malware spotlight: Command and Control (C2)
C2 refers to the communication methods malware uses to talk to its operators. A C2 server can manage thousands upon thousands of infected systems, and a single command issued from it can bring them all into action. Advanced threats often encrypt their C2 channels, for example with Secure Sockets Layer (SSL), as used in HTTPS, or with Secure Shell (SSH). This encryption makes the threat harder to monitor and detect, and makes it much harder to identify specific commands once C2 traffic has been found. When we examined the engineer’s emails, we found one message from the recruiter that arrived just before the beaconing activity began. The email contained a job-listing document with embedded malware that beaconed to a known malicious Chinese IP address.
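One common detection heuristic follows directly from this description: beacons fire on a timer, so the intervals between a host’s outbound connections to a single destination are far more regular than human traffic. The sketch below is illustrative only; the function name and the jitter threshold are our own choices, not taken from any specific product.

```python
# Illustrative heuristic (not a production detector): flag hosts whose
# outbound connection timestamps are suspiciously evenly spaced,
# a common signature of malware beaconing to a C2 server.
from statistics import mean, pstdev

def looks_like_beaconing(timestamps, max_jitter_ratio=0.1):
    """timestamps: sorted connection times (seconds) to one destination."""
    if len(timestamps) < 4:
        return False  # too few samples to judge
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg = mean(gaps)
    if avg == 0:
        return False
    # Low variation relative to the mean interval suggests a timer, not a human.
    return pstdev(gaps) / avg < max_jitter_ratio

# A 60-second beacon with slight jitter vs. irregular human browsing:
beacon = [0, 60, 119, 181, 240, 301]
human = [0, 5, 9, 300, 302, 1200]
print(looks_like_beaconing(beacon))  # True
print(looks_like_beaconing(human))   # False
```

Real detectors must also handle beacons that deliberately randomize their sleep interval, which is why interval analysis is usually combined with other signals.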
The stolen data included blueprints for a novel piece of large-scale construction equipment. Attack profiling suggested that the most likely threat actor was a Chinese hacking organization long suspected of being state-funded. According to intelligence sources, these threat actors were known to have carried out similar attacks on a variety of victims and allegedly distributed stolen intellectual property to Chinese companies that were state-owned, state-operated, or state-supported.
They had done their research and identified the project’s chief design engineer as someone likely to have access to the data. Using a LinkedIn profile that posed as a recruiter offering attractive employment positions, the threat actors established contact with the engineer and then began sending emails with fictitious opportunities. One of the emails contained an attachment with an embedded malware file, which began beaconing to an outside IP address when opened. The threat actors then installed a PHP backdoor reverse shell on the chief design engineer’s computer.
The threat actors were then able to search the system’s data and collect sensitive information from attached USB hard drives and network file servers. At first glance the activity seemed almost normal: the chief engineer had access to all of the data repositories, and because he was so involved in the project, it would not seem suspicious for him to access the various files related to it.
After aggregating the data, the threat actors encrypted the intellectual property, making it unidentifiable to Data Loss Prevention (DLP) tools. Exfiltration was then trivial, accomplished via an outbound HTTP connection. By that point there was little the victim could do; the investigation confirmed its suspicion that a foreign competitor had used the stolen data to market a remarkably similar piece of equipment.
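Keyword-based DLP fails here because encrypted data contains no recognizable patterns, but the encryption itself leaves a statistical tell: near-maximal byte entropy. A hypothetical defender could flag high-entropy outbound payloads, as sketched below (the sample strings and thresholds are illustrative, not from this investigation):

```python
# Sketch: encrypted or compressed data has near-maximal byte entropy
# (approaching 8 bits/byte), while plain text sits much lower. Some defenses
# flag high-entropy outbound payloads precisely because keyword-based DLP
# cannot inspect them.
import math
import os

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

text = b"blueprints for large-scale construction equipment " * 100
random_like = os.urandom(5000)  # stands in for an encrypted payload

print(round(shannon_entropy(text), 2))         # low: plain English text
print(round(shannon_entropy(random_like), 2))  # close to 8.0 bits/byte
```

High entropy alone is not proof of exfiltration (legitimate TLS traffic looks the same), so such a check is usually one signal among several.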
Recovery and remediation
It is estimated that there will be 75 billion connected devices by 2025 in what is being called the “Internet of Things”. With advancements in microprocessors, sensing devices, and software, almost anything that can be connected soon will be.
The Pentagon’s Defense Against Cyber Attacks
Here’s what you need to remember: seven years ago, DoD created Comply-to-Connect (C2C) as a way of protecting its growing number of network endpoints.
The Internet is experiencing a rapid increase in the number of connected devices. It links everything with electronics and sensors: your phone, your computer, your video game console. By 2025 there are expected to be as many as 75 billion connected devices in the “Internet of Things”. Advances in microprocessors, sensing devices, and software will allow almost anything to be connected.
It is no surprise that the IoT has expanded into government networks, especially those managed by the Department of Defense. Everything at DoD, from motors to battlefield sensors to door access readers, may need a network connection to do its job. DoD also runs a variety of consumer devices on its networks, including printers, video monitors, cameras, and refrigerators. These devices communicate constantly with each other, as well as with higher headquarters and the Pentagon. Some observers call this the “Internet of Battlefield Things,” or IoBT, and experts agree that the military that builds the IoBT first will hold a significant advantage over its competitors.
Although the evolution of the Internet into the IoT/IoBT is generally a positive development, it brings a significant cybersecurity challenge. Simply put, the more devices on a network, the more opportunities adversaries have to penetrate it. There have been many news stories about adversaries probing U.S. critical infrastructure, including the power grid, government networks, and election systems. Hackers often seek out connected devices as an easy way into a network: in 2016, vulnerabilities were found in implantable cardiac devices made by St. Jude Medical, and baby monitors have repeatedly been hacked.
This Nation’s adversaries are aggressively trying to penetrate DoD’s networks, systems, and individual weapons. The growing number of devices on these networks generates ever more critical and classified information. The military recently discovered that troop movements could be tracked through the fitness trackers worn by many personnel. As more devices are added to and removed from the IoBT, the risk of intrusion and compromise of classified information grows.
Cyber-attacks are increasing at an alarming pace, driven by the exponential growth of IoT/IoBT devices. Attackers are finding it increasingly easy to gain access to an organization’s network through a compromised IoT/IoBT device. Once a device is “whitelisted” onto a network, meaning it has been identified as trusted, that trusted device can be used to execute commands inside the firewall, allowing hackers to perform reconnaissance and potentially reach higher-value parts of a system. Unrecorded or unauthorized devices are also being added to networks, further increasing the likelihood of penetration. Adversaries can hack vulnerable devices not only to steal sensitive information but also to physically compromise a system, for example in wartime. Device vulnerability is a growing problem as the IoT/IoBT expands.
So what is the DoD doing to address this growing vulnerability? Its answer is C2C, which focuses on:
1) Identifying and validating the devices connected to a network;
2) Assessing their compliance with DoD security policy;
3) Continuously monitoring those devices; and
4) Automatically remediating device issues, reducing the burden on cybersecurity administrators of maintaining cyber hygiene.
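The four steps above can be pictured in a few lines of code. This is purely an illustrative model, not actual C2C tooling; the device fields, the approved-MAC list, and the policy checks below are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical device record: the fields are invented for illustration
# and do not come from any DoD specification.
@dataclass
class Device:
    mac: str
    patched: bool          # OS patches up to date?
    av_enabled: bool       # endpoint protection running?

# Step 1: identify/validate -- only known, registered devices are trusted.
APPROVED_MACS = {"00:1a:2b:3c:4d:5e"}

# Step 2: assess compliance with (a toy stand-in for) security policy.
def is_compliant(device: Device) -> bool:
    return device.patched and device.av_enabled

# Steps 3 and 4: monitor every connection attempt and automatically
# remediate by quarantining anything unknown or non-compliant.
def admit(device: Device, quarantine: list) -> bool:
    if device.mac not in APPROVED_MACS or not is_compliant(device):
        quarantine.append(device.mac)
        return False
    return True
```

The key design point mirrors C2C’s principle: trust is decided per connection attempt, and remediation happens automatically rather than waiting on an administrator.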
C2C combines existing cybersecurity technologies with more modern ones to address the changing nature of DoD’s network architecture. C2C’s core principle is to understand what devices and people connect to DoD networks and what their security postures are. This information allows commanders to make informed decisions about the security of these connections and then manage them automatically according to security policy. C2C gives DoD a means of continuously monitoring the status of its networks and devices, both computing and non-computing, with a high degree of fidelity. C2C data will feed into a central console, giving leaders complete situational awareness of the major areas of risk and informing policy making and resource allocation.
Without C2C, the DoD will not know how many industrial controllers, printers, or refrigerators are connected to its networks. It won’t be able to identify where its Windows patch management tools have stopped working. It won’t be able to verify that Kaspersky- and Huawei-made equipment has been removed from its systems, as required by Congress. It won’t be able to channel network information to decision makers. These fundamentals are simply not available without it.
The National Highway Traffic Safety Administration recently sent Google a letter that allows its driverless car initiative to move forward yet another step. According to Reuters, the letter, written by NHTSA Chief Counsel Paul Hemmersbaugh, states that the regulatory body now considers the computer systems aboard these vehicles to be the driver.
Although this doesn’t remove every obstacle to the mass deployment of driverless vehicles, the NHTSA’s recognition is a huge step forward for the technology.
AI is a huge leap forward
This may seem like a straightforward development, since the computer is clearly driving the vehicle, but the NHTSA’s letter to Google shows that regulators are beginning to accept that self-driving cars are coming.
The first apparent ramification of this correspondence concerns how blame will be assigned in the event of an accident. The letter doesn’t address fault in detail, but because the NHTSA now considers the car to be the driver, passengers inside the vehicle won’t be considered to have been “behind the wheel” when a crash occurs. Google and Volvo have both stated publicly that they will assume full responsibility for any accident involving one of their driverless cars.
Self-driving cars could reduce the number of accidents caused by human error. If anything, Google’s cars are perhaps too cautious. In 2009, a Google car was stuck at a stop sign because the other, human-driven cars never came to a complete stop; the car had to wait until its safety software allowed it to move. Something as routine as rolling through a stop sign is difficult to encode in a computer system, but the example shows just how carefully these vehicles drive.
There are still many obstacles to overcome
This is a huge step forward for the driverless car sector, but many challenges remain before self-driving cars become commonplace on the roads. One issue is that many drivers are reluctant to give up control of their cars. The letter also reminds Google that numerous regulations still require auto manufacturers to include equipment such as steering wheels and foot brakes in their cars.
That may sound like essential equipment, but Google is wary of letting humans take control of its cars whenever they want. As the Reuters article noted, the NHTSA is fully aware of this concern: the letter stated that regulations will have to change for Google’s autonomous cars to go without such controls.
Google still has a long way to go before its cars are mass-deployed, but this latest development from the NHTSA shows that regulators recognize the technology’s viability.
Cisco at the forefront of IoT development
This recognition of self-driving cars is a significant milestone for automakers, but it also shows how the Internet of Things trend continues to gain momentum. A driverless car absolutely needs an Internet connection, a technology area in which Cisco constantly innovates.
Cisco has shown serious interest in this space through recent acquisitions and its Internet of Everything initiative. An infographic Cisco created for Wired discusses the benefits of a connected vehicle.
The graphic shows that these cars could “talk” to the computers at mechanic shops, flagging functional issues before they become major problems. Google and other driverless car makers could point to this as another safety feature of their vehicles: a human driver typically cannot detect a mechanical problem before it starts affecting the drive.
Cisco is well known for its networking capabilities, so its move into the IoT makes sense. Although it is impossible to predict exactly what technology the driverless cars of the future will use, it is clear that Cisco is doing everything possible to position itself within this revolution.
Are you interested in Cisco training? New Horizons offers three levels of Cisco certifications: entry, associate, and professional.
Cisco Training & Certifications
Whether you’re new to Cisco or ready to advance your existing Cisco skills, New Horizons Learning Group can help you obtain three levels of Cisco certifications (entry, associate, and professional), all accreditations that are highly valued by employers and networking professionals worldwide.
Cisco Moves Deeper Into the IoT with Jasper Purchase
It’s a great day for the Internet of Things and those who like to trumpet its expected success. Cisco announced it will purchase Jasper Technologies, a company that helps enterprises connect their products to the Internet. The IoT is about bringing devices that were previously disconnected from the Internet onto the network, allowing them to communicate with one another and make your life easier.
Women are needed to fill IT job vacancies
The market for information technology jobs has remained stable, even through recent economic downturns. Aspiring IT professionals with basic computer training and certifications can find new opportunities in big data, cloud computing, cybersecurity, data center consolidation, and other technology sectors. For example, the rise of big data is creating new opportunities for IT workers who hold DB2 certification; IBM’s DB2 certification can be a great way to validate those skills. Big data is also expanding the possibilities of data analytics. Anyone who shops online knows the uncanny ability of Facebook and other popular websites to track what you’ve been doing.
It’s a little creepy at first. How do they know? Am I being watched? After a while, though, it becomes second nature, and some people are curious enough to learn how cookies work and how to clear them. But knowing that you’ve been shopping for boots, specifically Wolverine 1000 Mile boots at Zappos, is only the tip of the iceberg. Here are four fascinating, creepy, and cool examples of modern data analytics:
1. Breaking the news about a recent pregnancy
A few years back, Target received a complaint from a concerned father. His high school-aged daughter had been receiving advertisements in the mail for maternity wear and other pregnancy-related items, and he believed the company was encouraging his teenage daughter to procreate. Confused, Target’s representatives offered their sincere apologies, and even called back later to apologize again. That’s when the conversation turned: this time the father apologized to Target, because his daughter was indeed pregnant.
The question is: how did Target know? According to Forbes, it wasn’t because the company was spying on her. A representative explained that, through data analysis, Target had linked interest in unscented lotions and certain vitamin supplements to expectant mothers. Somewhat creepy, yes, but also quite amazing, and these predictive analytics capabilities can be applied elsewhere; the possibilities are endless.
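The kind of association analysis behind the Target story can be sketched in a few lines. The toy purchase data and the “baby registry” label below are entirely made up; the point is the lift calculation, a standard market-basket measure of how much one behavior raises the odds of another.

```python
# Toy purchase histories: each row is one shopper. All data is invented.
shoppers = [
    {"lotion": True,  "registry": True},
    {"lotion": True,  "registry": True},
    {"lotion": True,  "registry": False},
    {"lotion": False, "registry": False},
    {"lotion": False, "registry": False},
    {"lotion": False, "registry": True},
]

def lift(rows, antecedent, consequent):
    """How much more likely is `consequent` given `antecedent`?
    lift > 1 means the antecedent raises the odds of the outcome."""
    base = sum(r[consequent] for r in rows) / len(rows)
    matched = [r for r in rows if r[antecedent]]
    conditional = sum(r[consequent] for r in matched) / len(matched)
    return conditional / base

# In this toy data set, lotion buyers are about 1.33x as likely
# to be on a baby registry as shoppers overall.
score = lift(shoppers, "lotion", "registry")
```

At Target’s scale the same idea runs over millions of rows and many product signals at once, but the underlying arithmetic is no more exotic than this.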
2. Forgetting the ’six degrees’ thing
According to Facebook, you may be connected to anyone in the world by an average of just 3.5 degrees of separation. The social network is being used by more people than ever before, shrinking the distance between people.
In a blog post, Facebook noted that calculating this number across billions of people and hundreds of billions of friendship connections was a challenge: “[W]e use statistical methods…to accurately estimate distance based upon de-identified, aggregated data.”
To be fair, researchers at Cornell and Italy’s Università degli Studi di Milano had already narrowed the six down to 3.74 in 2011, according to USA Today. The human population has only grown since then, which makes it all the more telling that the number has continued to decline as social media spreads. It’s also a great example of what data analysis can teach us.
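What Facebook estimates statistically is, at bottom, an average shortest-path length over the friendship graph. On a toy graph small enough to compute exactly (the names and friendships below are made up), that looks like this:

```python
from collections import deque
from itertools import combinations

# A tiny, invented friendship graph. Facebook must estimate this number
# statistically over billions of accounts; here we can compute it exactly.
friends = {
    "ann": {"bob"},
    "bob": {"ann", "cat"},
    "cat": {"bob", "dan"},
    "dan": {"cat"},
}

def degrees(graph, src, dst):
    """Shortest number of friendship hops between two people (BFS)."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == dst:
            return dist
        for friend in graph[person]:
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # the two people are not connected at all

# Average the degrees of separation over every pair of people.
pairs = list(combinations(friends, 2))
average_degrees = sum(degrees(friends, a, b) for a, b in pairs) / len(pairs)
```

Doing this exhaustively is hopeless at Facebook’s scale (the number of pairs grows quadratically), which is exactly why the company resorts to the statistical estimation its blog post describes.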
3. Killing rats in Chicago
This actually happened; Fortune and other publications wrote about it. According to the reports, the Windy City spent twelve years collecting data on rodent-related complaints. The dataset also captured grievances about other issues, such as graffiti and smelly dumpsters, but the rats dominated. Data engineers used the vast amount of information accumulated over those years to predict rat breeding areas.
Brenna Berman, Chicago’s chief information officer, said the team discovered a really interesting relationship that led them to develop a rodent-prediction algorithm involving 31 variables, including calls about food poisoning in restaurants and overflowing trash bins.
The city’s sanitation department then used this information to combat rat infestations 20% more effectively than with other tactics. Fortune noted that this isn’t the first time Chicago has used predictive analytics to solve a problem: for the past three years, the city has been using 911 data to try to predict crime patterns, and on one occasion it was even able to prevent a shootout.
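A drastically simplified sketch of the idea: score each city block from correlated complaint counts and dispatch crews to the highest scores first. The two variables and their weights below are invented stand-ins for the 31 variables the city reportedly used.

```python
# Invented weights linking 311 complaint types to rat activity;
# the real model reportedly used 31 such variables.
WEIGHTS = {"food_poisoning_calls": 0.6, "overflowing_bins": 0.4}

def rat_risk(block_complaints: dict) -> float:
    """Weighted sum of complaint counts; higher means likelier rat activity."""
    return sum(WEIGHTS[k] * block_complaints.get(k, 0) for k in WEIGHTS)

# Toy complaint counts for two city blocks.
blocks = {
    "block_a": {"food_poisoning_calls": 5, "overflowing_bins": 8},
    "block_b": {"food_poisoning_calls": 1, "overflowing_bins": 2},
}

# Send baiting crews to the highest-scoring blocks first.
ranked = sorted(blocks, key=lambda b: rat_risk(blocks[b]), reverse=True)
```

The payoff is in the ranking, not the absolute scores: with limited crews, working the list from the top is what produced the reported 20% improvement over untargeted tactics.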
4. Finding a job
Data analytics can predict customer pregnancies, fight crime, kill rats, and measure the degree of human interconnectedness. As long as that continues, qualified data scientists will be needed across all industries and verticals.
Basic computer training goes a long way toward this end, but backed by formal DB2 training or Cognos certification, a new career becomes one more thing data analytics can accomplish. Machine learning, predictive analytics, and business intelligence are among the most popular technologies today, and they will remain so for years to come. There is no better time than now to get into data.
To find out if a DB2 certification might be right for you, check out New Horizons Learning Group’s IBM training today!
IBM Authorized Training
New Horizons Learning Group is authorized to offer IBM software training. In partnership with LearnQuest, it offers authorized training on the design and installation of IBM technology, storage, and hardware, and it can now draw on the award-winning IBM technical training content available only to global training providers. Big data, by definition, is data in amounts large enough to be analyzed to identify patterns and predict trends. And few technologies are as disruptive as cloud computing: this innovative service offers benefits to a wide range of industries.
Microsoft Azure is quickly becoming a standard in modern IT. Many IT professionals now rely on cloud computing services instead of purchasing, implementing, and maintaining their own compute and storage infrastructure. Azure is one of the most prominent infrastructure- and platform-as-a-service offerings, trailing only Amazon Web Services in most surveys of the largest cloud platforms by revenue, market share, and total users.
Azure is more than a single product; it is a collection of related services. Azure Stack, for instance, lets a company run Azure on-premises in the privacy of its own data center, while Azure IoT Hub is designed for monitoring the many sensors and devices that make up the Internet of Things.
The Azure ecosystem is so vast and varied that trying to master it, or even specialize within it, can be overwhelming at first. And because AWS is even more popular, Azure is often not the only IaaS/PaaS platform an IT professional has to learn. Still, some areas of Azure are particularly useful for building skills that will be valuable in the job market, and New Horizons Computer Learning Centers offers special Azure courses for AWS specialists.
So which Azure skills should you focus on? These three are especially useful today:
1. Azure SQL Database
Azure SQL Database is often marketed as a cloud-based database, or database-as-a-service. These terms reflect the fact that the underlying infrastructure of Azure SQL Database is managed by Microsoft at its own facilities rather than by the customer. This arrangement is ideal for app developers, since it minimizes capital expenditures and provides flexibility in both cost and scalability.
SQL Database is also adaptive: it learns the usage patterns of each individual workload and optimizes its performance accordingly. And if you work for a software-as-a-service provider, SQL Database is a reliable tool for isolating, managing, and securing the application instances of each individual customer in a complex multi-tenant environment.
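One common way to get that per-customer isolation is a database-per-tenant layout: each customer gets its own database on a shared logical server. As a sketch, the snippet below builds an ODBC-style connection string scoped to one tenant’s database; the server name, the `tenant_` naming scheme, and the driver version are assumptions for illustration, not prescriptions.

```python
# Hypothetical Azure SQL logical server name, for illustration only.
SERVER = "myserver.database.windows.net"

def tenant_conn_str(tenant_id: str, user: str, password: str) -> str:
    """Build an ODBC connection string scoped to one tenant's database,
    following a made-up database-per-tenant naming scheme."""
    db = f"tenant_{tenant_id}"
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server=tcp:{SERVER},1433;Database={db};"
        f"Uid={user};Pwd={password};Encrypt=yes;"
    )
```

Because every tenant’s queries run against a distinct database, a bug in the application layer cannot silently read another customer’s rows, which is the isolation property the paragraph above is describing.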
Microsoft recently expanded its Azure database offerings to attract more developers. The company launched Azure Cosmos DB, a NoSQL database suited to “planet-scale” apps, programs that need to reach users in many places around the globe. It also introduced managed PostgreSQL and MySQL services, as well as a new SQL Database deployment option, Managed Instance, that eases migration of existing databases to the Azure cloud.
2. Azure Bot Services
Chatbots have been a hot topic in consumer-facing tech for years. Microsoft, Facebook, and other companies are exploring how these automated conversational agents can assist with everything from customer service requests to online payment processing. The Microsoft Bot Framework had 130,000 developers as of May 2017, up from 45,000 in September 2016.
Bots can be published to channels such as Skype for Business, the Bing search engine, and the Cortana virtual assistant, and they can integrate seamlessly with third-party services. Azure Bot Services lets these bots be managed from the Azure cloud.
Like applications, bots benefit from the scalable storage and tight security that come with a cloud platform like Azure. Azure Bot Services offers pay-as-you-go plans, automatic scaling, security patching, and templates that make it easier to create new bots.
3. Azure Active Directory
One of the biggest challenges in Azure deployments is integrating the new services with existing on-premises infrastructure, especially the systems that govern identities. Businesses are increasingly turning to hybrid and public clouds for cost-effective, flexible IT operations, but many organizations cannot simply abandon the systems they already have in place.
This is a common problem. Active Directory Domain Services manages all user, computer, and application identities within the corporate security perimeter. This version of Active Directory is usually hosted on-premises, out of reach of the growing number of applications hosted in the cloud. That separation can cause latency problems, since authentication requests must traverse different infrastructures before they are approved or denied.
Azure Active Directory is the answer to these performance problems. It can serve as a replica of your existing on-premises directory, with all the same objects, identities, and permissions. Any changes made locally are reflected in the cloud, but not vice versa; Azure AD Connect handles the integration. An alternative is to create a virtual machine in Azure that runs Active Directory Domain Services itself, which may require connectivity such as a virtual private network or the Azure ExpressRoute service.
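The one-way sync behavior can be pictured with a minimal sketch in which each directory is just a dictionary of objects and their attributes. This is purely illustrative; the real Azure AD Connect sync engine is far more involved.

```python
# Each "directory" is modeled as {object_name: {attribute: value}}.
# This toy captures only the direction of sync, nothing else.
def sync_to_cloud(on_prem: dict, cloud: dict) -> dict:
    """One sync cycle: on-prem objects and attribute changes are copied
    to the cloud replica. Cloud-side edits to synced objects are simply
    overwritten on the next cycle; nothing flows back down."""
    for name, attrs in on_prem.items():
        cloud[name] = dict(attrs)   # on-prem always wins for synced objects
    return cloud
```

In this model, a cloud-side edit to a synced user (say, changing a department attribute) is undone on the next cycle, while objects that exist only in the cloud are left untouched, which is the "local changes flow up, but not vice versa" behavior described above.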
New Horizons Computer Learning Centers invites you to dive into Azure
We’ve only scratched the surface of what Azure has to offer. Microsoft’s cloud includes many other services, and new ones (such as the additional databases mentioned earlier) are constantly being added. Whether you are an AWS expert looking to learn Azure or a complete beginner to IaaS/PaaS, find a New Horizons Computer Learning Center in your area today.