1. Parties Involved
2. Security Policy
3. Security Awareness
4. Classification of Data
5. Security Assessments and Compliance (Customer Infrastructure)
5.1. Data Center and Customer Infrastructure
5.1.1. Network Security (Firewalls)
5.1.2. DDoS Mitigation
5.1.3. Port Scanning
5.1.4. Anti-virus and Anti-Malware
5.1.5. Authority, Access and Data Security
5.2. Encryption and KMS
5.3. Event Logs
5.4. Data Backup and Retention
5.5. Disaster Recovery
5.6. Responsibilities and Rights
6. Security Assessments and Compliance (Builder.ai Infrastructure)
6.4. Password Management
6.5. Access Logs
6.6. Data Access and Security
6.7. Data Backup and Retention
6.8. Disaster Recovery
6.9. Responsibilities, Rights and Duties of Personnel
7. Audits and Compliance
Inventory Configuration Audit
7.1. Security Audit
7.2. Continuous Compliance Audit
8.0 File Integrity Monitoring & Logging
9.0 Conduct Vulnerability Assessment
10. Perform Penetration Testing
11. Security Awareness
This policy applies to all Builder.ai employees and to other parties such as vendors, existing or potential customers, and trainees.
We are responsible for the security of workloads running on AWS. This includes protecting instances, the applications running on them, and AWS credentials, amongst other things. Amazon provides tools such as the AWS IAM (Identity & Access Management) service, Multi-Factor Authentication (MFA), SecurityHub for monitoring CIS compliance, GuardDuty for continuous threat detection, AWS Config for ensuring configuration rule compliance, security groups, and AWS KMS to help with these security responsibilities. The use of these tools is detailed below and in specific documentation that can be found in our company wiki.
In addition, we must be mindful of general security best practices and augment Amazon’s offerings with best-in-class third-party solutions where appropriate. These practices will help build a secure computing environment.
We will adhere to the AWS Well-Architected design principles:
This document will be reviewed monthly by a nominated set of Builder.ai’s technical leadership. This group will be referred to as the security review board.
As part of this review process, any gaps discovered as per ongoing security trends, customer feedback or security incidents may lead to updates of this policy.
All changes to the document should be supplied to the security review board before the monthly meeting where they will be reviewed and approved.
We will maintain named versions of this document for each change using the format major.minor:
Drafts will increment the minor version number: 0.1, 0.2
Final documents will increment the major version number: 1.0, 2.0
This will be implemented by creating a “named version” in the version history of the document.
We perform security awareness at two levels:
We update our clients (and ourselves) regarding security awareness related to all security fields including the steps needed to maintain better security for their AWS infrastructure.
Our recommendations to customers:
At Builder.ai we follow a consistent method of classification of data to ensure consistency in handling when the information is shared within the organization and external bodies. The documents and data that need to be protected are classified under the following categories:
Builder.ai hosts and manages customer infrastructure within Amazon secure data centers, and utilizes components and technologies within these data centers to provide security protections at multiple layers and against multiple threats. Amazon continually manages risk and undergoes recurring assessments to ensure compliance with industry standards. Refer to https://aws.amazon.com/security/ for details of Security and Compliance at AWS.
5.1.1 Network Security (Firewalls)
Amazon provides tools such as the AWS EC2 firewall for securing cloud-based servers.
Each Amazon EC2 instance is protected by one or more security groups, which contain sets of firewall rules specifying which types of network traffic may be delivered to that particular instance.
At a higher level, we can also apply network access controls on the load balancers and at the VPC level.
By default, the firewall operates in deny-all mode, and customers must explicitly open the ports needed to allow inbound traffic. Firewall policies often contain overly permissive sets of rules, which create security holes. We recommend that customers lock down their firewall policy and only allow communication that is absolutely required.
Creating a restrictive traffic management and security design on a per-instance basis is the customer’s responsibility. For example, customers should not open remote access (RDP or SSH) to all of their production instances; instead, they should use “bastion hosts” to get remote access to production instances and lock down administrative access from the external network to only the bastion host.
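To illustrate the lockdown recommendation above, the following is a minimal sketch (not our production tooling) of a check that flags security-group rules exposing administrative ports to the whole internet. The rule dictionaries loosely mirror the `IpPermissions` structure returned by the EC2 `DescribeSecurityGroups` API; the function name is illustrative.

```python
# Hedged sketch: flag security-group rules that open SSH (22) or RDP (3389)
# to the entire internet (0.0.0.0/0). Rule dicts loosely mirror the shape of
# EC2 DescribeSecurityGroups "IpPermissions" entries.
ADMIN_PORTS = {22, 3389}

def overly_permissive_rules(ip_permissions):
    """Return the rules that expose an administrative port to 0.0.0.0/0."""
    flagged = []
    for rule in ip_permissions:
        from_port = rule.get("FromPort")
        to_port = rule.get("ToPort")
        if from_port is None or to_port is None:  # e.g. all-traffic rules
            continue
        open_to_world = any(
            r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])
        )
        exposes_admin = any(from_port <= p <= to_port for p in ADMIN_PORTS)
        if open_to_world and exposes_admin:
            flagged.append(rule)
    return flagged

rules = [
    {"FromPort": 22, "ToPort": 22, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    {"FromPort": 443, "ToPort": 443, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    {"FromPort": 22, "ToPort": 22, "IpRanges": [{"CidrIp": "10.0.0.0/16"}]},
]
print(len(overly_permissive_rules(rules)))  # only the first rule is flagged
```

In practice a check like this would run against live `describe_security_groups` output; restricting SSH/RDP to a bastion host’s security group or CIDR is the pattern recommended above.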
AWS provides flexible infrastructure and tools that enable customers to implement strong DDoS mitigations. For more details, refer to: Denial of Service Attack Mitigation on AWS – AWS Answers. We work closely with AWS support teams to quickly respond to events and enable advanced DDoS mitigation controls when needed.
Port scanning is prohibited and our infrastructure providers investigate every reported instance. When port scans are detected, they are stopped and access is blocked.
We recommend that our customers deploy anti-virus solutions to protect their systems from virus and malware threats. Customers can either extend their existing on-premises anti-virus solution to the AWS cloud, or Builder.ai can deploy and manage an anti-virus/anti-malware solution on their behalf.
Data at Rest: Data at rest is data that is not actively moving from device to device or network to network, such as data stored on a hard drive, laptop, flash drive, or archived/stored in some other way. Data protection at rest aims to secure inactive data stored on any device or network. While data at rest is sometimes considered to be less vulnerable than data in transit, attackers often find data at rest a more valuable target than data in motion. We recommend encrypting sensitive data at rest.
Data in Transit: Data in transit, or data in motion, is data actively moving from one location to another, such as across the internet or through a private network. Data protection in transit is the protection of this data while it is traveling from network to network or being transferred from a local storage device to a cloud storage device. Wherever data is moving, effective data protection measures for in-transit data are critical, as data is often considered less secure while in transit. We recommend all data in transit be encrypted over a protocol such as TLS/SSL.
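One common way to make the in-transit recommendation concrete is an S3 bucket policy that denies any request not made over TLS, using the `aws:SecureTransport` condition key. The following is a sketch only, not a mandated configuration; the bucket name is a placeholder.

```python
import json

# Sketch of an S3 bucket policy denying non-TLS (plain HTTP) access.
# "example-bucket" is a placeholder bucket name.
def deny_insecure_transport_policy(bucket_name):
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
                # Deny any request that did not arrive over TLS.
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }

print(json.dumps(deny_insecure_transport_policy("example-bucket"), indent=2))
```

A policy like this is attached to the bucket so that unencrypted requests are rejected regardless of the caller’s IAM permissions.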
5.2 Encryption and KMS
5.3 Event Logs
5.4 Data Backup and Retention
We have enabled EC2 server backups through AWS Lifecycle Manager, with a CloudWatch event scheduled for every midnight that backs up each EC2 server by creating an AMI of it.
As a second option, we maintain backup scripts on a minimum-configuration server; these scripts run every midnight and create an AMI of the target server.
We have set the retention period to delete each AMI 7 days after creation. This is implemented with a Lambda function triggered by a CloudWatch event; the second option is a script that deletes AMIs older than 7 days. For our RDS DB servers, we have enabled automatic RDS backups.
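The 7-day retention described above can be sketched as pure selection logic. In the real Lambda function, the returned image IDs would be passed to the EC2 `deregister_image` API, which is omitted here so the sketch stays self-contained; the image names and dates are illustrative.

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 7  # matches the retention period described above

def amis_to_delete(amis, now, retention_days=RETENTION_DAYS):
    """Return IDs of AMIs whose CreationDate is older than the retention period.

    `amis` loosely mirrors the shape of EC2 DescribeImages results; in the
    real Lambda the returned IDs would be deregistered via the EC2 API.
    """
    cutoff = now - timedelta(days=retention_days)
    return [
        image["ImageId"]
        for image in amis
        if datetime.fromisoformat(image["CreationDate"]) < cutoff
    ]

now = datetime(2024, 1, 10, tzinfo=timezone.utc)
amis = [
    {"ImageId": "ami-old", "CreationDate": "2024-01-01T00:00:00+00:00"},
    {"ImageId": "ami-new", "CreationDate": "2024-01-09T00:00:00+00:00"},
]
print(amis_to_delete(amis, now))  # only the 9-day-old AMI is selected
```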
5.5 Disaster Recovery
We have detailed recovery plans for our internal services, each tailored to the service’s design. A detailed runbook can be found here: Disaster recovery plan - Internal Only (contact us for more information)
We are able to provide consultancy to customers on setting up comprehensive DR plans. We also offer a variety of backup solutions. These will be tailored to meet specific customer requirements by our Solution Architects.
Business continuity test drills
We ensure business continuity test drills are conducted once every year. The following tests must be performed to ensure business continuity:
A firewall is deployed on-premises to restrict access to systems from external networks and between systems internally.
Systems used by all employees of the organization are protected with the latest version of the anti-virus system.
Local systems used by the managed services team of our organization to access customer environments are protected with full disk encryption to ensure that customer information is protected in the event of any theft / unauthorised access.
Server side encryption at rest will be defined in a separate design document.
Once data is in the cloud, whether in a public cloud or a PVC environment, customers should consider:
We have enabled EC2 server backups through a Lambda function, with a CloudWatch event scheduled for every midnight that backs up each EC2 server by creating an AMI of it. As a second option, we maintain backup scripts on a minimum-configuration server; these run every midnight and create an AMI of the target server. We have set the retention period to delete each AMI 7 days after creation, implemented with a Lambda function triggered by a CloudWatch event; the second option is a script that deletes AMIs older than 7 days. For our RDS DB servers, we have enabled automatic RDS backups.
At Builder.ai we ensure appropriate configuration is maintained for all clients by setting up AWS Config rules. This is configured for each customer account as described in the AWS Config guide here: Internal document, contact us for more information. Each customer account feeds into an internal master account that triggers alerts whenever an account’s configuration changes and/or becomes non-compliant. Each such event creates a ticket in our MSP team’s ticketing system and is investigated and actioned within 24 hours or less, depending on severity.
We also set up SecurityHub on customer accounts to ensure infrastructure is, and remains, CIS compliant. Violations also trigger alerts and Freshdesk tickets that will be actioned within 24 hours or less.
This is an additional service Builder.ai offers to its customers as per their compliance requirements. Builder.ai conducts tool-based audits to find non-compliances and takes corrective actions.
The next step is to ensure the continuous integrity of critical system files, application configuration files, and application logs.
We recommend that customers implement integrity monitoring and log analysis solutions to detect any unauthorized modifications to their critical system components: files, registry, services, and processes.
Logging is another important component of information security. If logs are not collected, security incidents cannot be captured; and if logs and security events are not monitored, incidents cannot be detected.
It is important to enable logging for all components that provide visibility into our computing environments including:
Logs in Glacier are retained for another 455 days before being permanently removed. If a customer needs to change this log retention policy due to compliance or its own enterprise policy, appropriate changes are made to the retention configuration for that customer. The retention policy is captured either in the MSA or in a separate document duly approved by the customer.
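The Glacier retention described above could be expressed as an S3 lifecycle configuration along the following lines. This is a sketch only: the age at which logs transition to Glacier (before the 455-day retention window begins) is customer-specific, so it is deliberately left as a parameter, and the rule ID and log prefix are assumptions.

```python
# Sketch of an S3 lifecycle rule: transition log objects to Glacier after a
# customer-specific number of days, then expire them 455 days later. The
# transition age is a parameter because it varies per customer agreement.
GLACIER_RETENTION_DAYS = 455

def log_lifecycle_rule(transition_after_days):
    return {
        "Rules": [
            {
                "ID": "log-retention",  # illustrative rule name
                "Filter": {"Prefix": "logs/"},  # assumed log prefix
                "Status": "Enabled",
                "Transitions": [
                    {"Days": transition_after_days, "StorageClass": "GLACIER"}
                ],
                # Expiration days count from object creation, so total
                # lifetime is transition age + 455 days in Glacier.
                "Expiration": {
                    "Days": transition_after_days + GLACIER_RETENTION_DAYS
                },
            }
        ]
    }

rule = log_lifecycle_rule(30)
print(rule["Rules"][0]["Expiration"]["Days"])  # 485
```

A configuration like this would be applied per customer (e.g. via the S3 `put_bucket_lifecycle_configuration` API), with the parameter adjusted to match the retention policy captured in the MSA.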
If a customer has already established a monitoring solution and is collecting logs to a central server, then instances running in the cloud are just another resource that must be monitored.
Firewall configuration changes may be required to allow logs from the cloud environment to reach the central log-collection server on-premises in addition to securing the data transmission path.
We conduct vulnerability tests on our internal AWS account by using Amazon Inspector. Amazon Inspector is a security vulnerability assessment service that helps improve the security and compliance of applications deployed on Amazon EC2. It automatically assesses applications for vulnerabilities or deviations from best practices, and then produces a detailed list of security findings prioritized by level of severity. Amazon Inspector includes a knowledge base of hundreds of rules mapped to common security standards and vulnerability definitions that are regularly updated by AWS security researchers. After fetching the reports from Amazon Inspector, we evaluate them and perform all the security measures that are mentioned in that report.
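The evaluation step above, working through findings prioritized by severity, can be sketched as a simple sort. The severity labels follow Amazon Inspector’s convention, but the finding records and function name here are illustrative.

```python
# Sketch: order vulnerability findings so the most severe are handled first.
# Severity labels follow Amazon Inspector's convention; records are illustrative.
SEVERITY_ORDER = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3, "Informational": 4}

def prioritize_findings(findings):
    """Return findings sorted from most to least severe."""
    return sorted(findings, key=lambda f: SEVERITY_ORDER[f["severity"]])

findings = [
    {"title": "Weak TLS configuration", "severity": "Medium"},
    {"title": "Remote code execution", "severity": "Critical"},
    {"title": "Outdated package", "severity": "Low"},
]
print([f["severity"] for f in prioritize_findings(findings)])
# → ['Critical', 'Medium', 'Low']
```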
Once customers have created their desired security posture around their running instances, we recommend that they evaluate the security of their systems by conducting penetration testing to safely exploit system vulnerabilities, including OS service and application weaknesses.
A vulnerability assessment identifies vulnerabilities, but not the potential consequences if those vulnerabilities are exploited. For example, a vulnerability scan may show a SQL injection vulnerability; attempting to exploit it during penetration testing could reveal personally identifiable information (PII), or instead demonstrate that the exposed data is within the organization’s risk tolerance.
Penetration testing is a very useful approach to validating the effectiveness of defensive mechanisms. This exercise helps customers determine whether their security control implementation can withstand real-world attacks. Amazon understands the critical importance of penetration testing in any secure application deployment, and has therefore established a policy allowing customers to request permission to conduct penetration tests. The PCI Data Security Standard, FISMA, NIST, and other legislative and industry regulations also mandate penetration testing. First, we request permission from AWS using the form at the link below:
Then, we complete the form with all information required by the AWS support team. After confirmation and approval from AWS, we begin penetration testing.