34 Cards in this set

Front | Back
Explain File Activity Monitoring capabilities
|
File activity monitoring is similar to database activity monitoring in many respects. In both cases, you discover the sensitive data on your servers and configure policies to create rules about data access and actions to be taken when rules are met.
|
Capabilities of File Activity Monitoring
|
• Discovery to inventory files and metadata.
• Classification to crawl through the files to look for potentially sensitive data, such as credit card information or PII.
• Monitoring, which can be used without discovery and classification, to monitor access to files and, based on policy rules, audit and alert on inappropriate access or even block access to the files to prevent data leakage. |
IBM Guardium products general description
|
IBM Guardium products provide a simple, robust solution for preventing data leaks from databases and files, helping to ensure the integrity of information in the data center and automating compliance controls. |
In which ways can Guardium products help you?
|
• Automatically locate databases and discover and classify sensitive information within them;
• Automatically assess database vulnerabilities and configuration flaws;
• Ensure that configurations are locked down after recommended changes are implemented;
• Enable high visibility at a granular level into database transactions that involve sensitive data;
• Track activities of end users who access data indirectly through enterprise applications;
• Monitor and enforce a wide range of policies, including sensitive data access, database change control, and privileged user actions;
• Create a single, secure centralized audit repository for large numbers of heterogeneous systems and databases; and
• Automate the entire compliance auditing process, including creating and distributing reports as well as capturing comments and signatures. |
What are the Key security concepts used in Guardium data activity monitoring
|
- Policies and rules
- Workflows
- Auditing
- Classification |
Meaning of Policies and Rules
|
A security policy contains an ordered set of rules to be applied to the observed traffic between database clients and servers. Each rule can apply to a request from a client, or to a response from a server. Multiple policies can be defined and multiple policies can be installed on a Guardium system at the same time.
|
Describe the Process of Policies and Rules
|
Each rule in a policy defines a conditional action. The condition can be a simple test, for example, a check for any access from a client IP address not found in an Authorized Client IPs group, or it can be a complex test that evaluates multiple message and session attributes such as database user, source program, command type, and time of day. Rules can also be sensitive to the number of times a condition is met within a specified timeframe.
The action triggered by the rule can be a notification action (for example, an e-mail to one or more recipients), a blocking action (the client session might be disconnected), or the event can simply be logged. |
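The rule/action structure described above can be pictured with a small sketch. This is an illustrative model only, assuming hypothetical class and field names (Request, Rule, Policy); it is not the Guardium API.

```python
# Illustrative sketch of a policy rule as a condition paired with an action.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Request:
    client_ip: str
    db_user: str
    command_type: str
    hour_of_day: int

@dataclass
class Rule:
    description: str
    condition: Callable[[Request], bool]   # simple or compound test
    action: Callable[[Request], None]      # notify, block, log, ...

@dataclass
class Policy:
    rules: List[Rule] = field(default_factory=list)

    def evaluate(self, request: Request) -> None:
        # Rules are applied in order; each matching rule triggers its action.
        for rule in self.rules:
            if rule.condition(request):
                rule.action(request)

# Example: alert on any access from a client IP outside an "authorized" group.
AUTHORIZED_CLIENT_IPS = {"10.0.0.5", "10.0.0.6"}

policy = Policy(rules=[
    Rule(
        description="Unauthorized client IP",
        condition=lambda r: r.client_ip not in AUTHORIZED_CLIENT_IPS,
        action=lambda r: print(f"ALERT: access from unapproved IP {r.client_ip}"),
    ),
])

policy.evaluate(Request("192.168.1.50", "appuser", "SELECT", 23))
```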
Meaning of Workflows
|
Workflows consolidate several database activity monitoring tasks, including asset discovery, vulnerability assessment and hardening, database activity monitoring and audit reporting, report distribution, sign-off by key stakeholders, and escalations.
|
Describe the process of Workflows
|
Workflows are intended to transform database security management from a time-consuming manual activity performed periodically to a continuously automated process that supports company privacy and governance requirements, such as PCI-DSS, SOX, Data Privacy and HIPAA. In addition, workflows support the exporting of audit results to external repositories for additional forensic analysis via Syslog, CSV/CEF files, and external feeds.
|
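As a sketch of the export side mentioned above, the snippet below writes audit results as a CSV file and prints CEF-style lines. The result fields and the CEF field mapping are assumptions for illustration, not Guardium's actual export schema.

```python
# Hedged sketch: exporting audit results as CSV and CEF-style lines.
import csv

results = [
    {"timestamp": "2024-01-15T02:10:00Z", "db_user": "appuser",
     "verdict": "violation", "rule": "After hours login"},
]

# CSV export
with open("audit_results.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["timestamp", "db_user", "verdict", "rule"])
    writer.writeheader()
    writer.writerows(results)

# CEF line layout: CEF:Version|Vendor|Product|DeviceVersion|SignatureID|Name|Severity|Extension
def to_cef(r):
    return (f"CEF:0|IBM|Guardium|10.5|{r['rule']}|{r['rule']}|5|"
            f"duser={r['db_user']} end={r['timestamp']} outcome={r['verdict']}")

for r in results:
    print(to_cef(r))
```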
Meaning of Auditing
|
Guardium provides value change auditing features for tracking changes to values in database tables.
|
Describe the process of Auditing
|
For each table in which changes are to be tracked, you can select which SQL value-change commands to monitor (insert, update, delete). Before and after values are captured each time a value-change command is executed against a monitored table. This change activity is uploaded to Guardium on a scheduled basis, after which all of Guardium's reporting and alerting functions can be used.
You can view value-change data from the default Values Changed report, or you can create custom reports using the Value Change Tracking domain. |
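A minimal sketch of the before/after capture idea, using an in-memory list in place of the scheduled upload to the collector; the record layout is an assumption, not the Values Changed report schema.

```python
# Illustrative sketch of before/after value capture for value-change auditing.
from datetime import datetime, timezone

audit_log = []   # would be uploaded to the collector on a schedule

def audited_update(table, row, column, new_value):
    # Capture the value before the change, apply the change, record both values.
    before = row.get(column)
    row[column] = new_value
    audit_log.append({
        "table": table,
        "command": "UPDATE",
        "column": column,
        "before": before,
        "after": new_value,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    })

employee = {"id": 7, "salary": 50000}
audited_update("EMPLOYEES", employee, "salary", 55000)
print(audit_log)
```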
Describe the process of Classification
|
A classification policy is a set of rules designed to discover and tag sensitive data elements. Actions can be defined for each rule in a classification policy, for example to generate an email alert or to add a member to a Guardium group, and classification policies can be scheduled to run against specified datasources or as tasks in a workflow.
|
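As a hedged sketch of a single classification rule, the snippet below pairs a pattern with an action taken on a match. The regular expression is a deliberately crude credit-card-like pattern and the function names are hypothetical.

```python
# Sketch of a classification rule: a pattern plus an action taken on a match.
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")   # crude 16-digit card-like pattern

def classify(datasource_rows):
    matches = []
    for table, column, value in datasource_rows:
        if CARD_PATTERN.search(str(value)):
            # Real actions might send an e-mail alert or add the object to a Guardium group.
            matches.append((table, column))
            print(f"Tagging {table}.{column} as potentially sensitive")
    return matches

classify([("ORDERS", "note", "card 4111 1111 1111 1111 on file"),
          ("ORDERS", "status", "shipped")])
```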
Databases supported on Guardium 10.5
|
Oracle
MS SQL Server
DB2
Informix
Sybase
Netezza
PostgreSQL
Teradata
BigInsights
Cloudera
Aster
Cassandra
CouchDB
Greenplum
Hortonworks
MariaDB
MongoDB (including v2)
MemSQL
SAP HANA
HP Vertica |
Operating systems currently available and working for S-TAP
|
AIX
z/OS
HP-UX
Red Hat Enterprise Linux
SuSE Enterprise
Solaris - SPARC
Solaris - Intel
Windows Server
IBM i
Ubuntu
OpenSSL for UNIX S-TAP
CentOS for UNIX S-TAP
TLS 1.2 |
Supported web browsers for Guardium
|
Internet Explorer 9 (IE9) and above on Windows 7, with Internet Explorer's Compatibility View setting turned off.
Firefox ESR 24 and above
Chrome 28 and above
Minimum screen resolution: 1366 x 768 |
Minimum and Recommended Resources per software/virtual appliance
|
Physical CPUs - Minimum: 4 cores; Recommended: 8 cores
Virtual CPUs - Minimum: 4 vCPUs; Recommended: 8 vCPUs
RAM (64-bit) - Minimum: 24 GB; Recommended: 32 GB; Maximum: motherboard max
Ports (NICs) - 1-4; a 1 Gbit or 10 Gbit per second card is recommended; a 10 Gbit per second card can be used in a 64-bit system with sufficient memory
Disk Size - Minimum: 300 GB; Maximum: >2 TB; Recommended: Collectors 300-600 GB, Aggregators 600-1800 GB. Guardium supports smaller hard disks for integrated warehouse configurations, using datamart interfaces (10.1.3 and later).
Disk Speed - 7200 RPM to 15,000 RPM
*Disk sizes over 2 TB can be handled, but accessing the stored information becomes more difficult and this will affect performance. |
What are the Vulnerability Assessment Tests
|
Guardium® provides over two hundred predefined tests to check database configuration
parameters, privileges, and other vulnerabilities. You can also define your own tests. A Vulnerability Assessment may contain one or more of the following types of tests. |
What are the Predefined Tests
|
Predefined tests are designed to illustrate common vulnerability issues that may be encountered in database environments. Because of the highly variable nature of database applications and the differences in what is deemed acceptable in various companies or situations, some of these tests may be suitable for certain databases but totally inappropriate for others (even within the same company).
|
What are the Behavioral Tests
|
This set of tests assesses the security health of the database environment by observing database traffic in real time and discovering vulnerabilities in the way information is being accessed and manipulated.
As an example, some of the behavioral vulnerability tests included are:
• Default users access
• Access rule violations
• Execution of Admin, DDL, and DBCC commands directly from the database clients
• Excessive login failures (see the sketch after this card)
• Excessive SQL errors
• After hours logins
• Excessive administrator logins
• Checks for calls to extended stored procedures
• Checks that user ids are not accessed from multiple IP addresses |
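A small sketch of one behavioral check, excessive login failures: failures per user are counted inside a sliding window and users over a threshold are flagged. The window, threshold, and event fields are assumptions for illustration.

```python
# Sketch of an "excessive login failures" check over a sliding time window.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)
THRESHOLD = 3

def excessive_login_failures(events):
    failures = defaultdict(list)   # user -> timestamps of failed logins
    flagged = set()
    for user, ts, ok in sorted(events, key=lambda e: e[1]):
        if ok:
            continue
        # Keep only failures inside the window, then add the current one.
        failures[user] = [t for t in failures[user] if ts - t <= WINDOW] + [ts]
        if len(failures[user]) >= THRESHOLD:
            flagged.add(user)
    return flagged

now = datetime(2024, 1, 15, 9, 0)
events = [("scott", now + timedelta(minutes=i), False) for i in range(3)]
print(excessive_login_failures(events))   # {'scott'}
```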
What are the Configuration Tests
|
This set of assessments checks security-related configuration settings of target databases, looking for common mistakes or flaws in configuration that create vulnerabilities.
As an example, the current categories, with some high-level tests, for configuration vulnerabilities include:
• Privilege
  o Object creation / usage rights
  o Privilege grants to DBA and individual users
  o System level rights
• Authentication
  o User account usage
  o Remote login usage
  o Password regulations
• Configuration
  o Database specific parameter settings
  o System level parameter settings
• Version
  o Database versions
  o Database patch levels
• Object
  o Installed sample databases
  o Recommended database layouts
  o Database ownership |
What are the Query-based Tests
|
A query-based test is either a predefined or user-defined test that can be quickly and easily created by defining or modifying a SQL query, which is run against a database datasource and whose results are compared to a predefined test value. See Define a Query-based Test for additional information on building a user-defined query-based test.
|
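A minimal sketch of a query-based test, assuming a simple comparison of a single query result against a predefined value; sqlite3 stands in for a real database datasource.

```python
# Sketch: run a SQL query against a datasource and compare the result to a test value.
import sqlite3

def run_query_based_test(conn, query, expected, comparison="<="):
    (actual,) = conn.execute(query).fetchone()
    passed = actual <= expected if comparison == "<=" else actual == expected
    return {"actual": actual, "expected": expected, "passed": passed}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", 1), ("bob", 0), ("carol", 1)])

# Hypothetical test: no more than 2 accounts should hold admin rights.
result = run_query_based_test(conn,
                              "SELECT COUNT(*) FROM users WHERE is_admin = 1",
                              expected=2)
print(result)
```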
What are the CAS-based Tests
|
A CAS-based test is either a predefined or user-defined test that is based on a CAS template item of type OS Script command and uses CAS-collected data.
Users can specify which template item to use and test against the content of the CAS results. See Create a New Template Set Item for assistance on creating an OS Script type CAS template. |
What are the Entitlement reviews
|
Entitlement reviews are the process of validating and ensuring that users only have the privileges required to perform their duties.
Along with authenticating users and restricting role-based access privileges to data, even for the most privileged database users, there is a need to periodically perform entitlement reviews. This is also known as database user rights attestation reporting. |
Database Auto-discovery Overview functionality
|
There are many scenarios where databases can exist undetected on your network and expose your network to potential risk. Old databases might be forgotten and unmonitored, or a new database might be added as part of an application package. A rogue DBA might also create a new instance of a database to conduct malicious activity outside of the monitored databases.
Auto-discovery uses scan and probe jobs to ensure that no database goes undetected in your environment.
• A scan job scans each specified host (or hosts in a specified subnet) and compiles a list of open ports that are specified for that host.
• A probe job uses the results of the scan to determine whether there are database services running on the open ports. A probe job cannot be completed without first running a scan.
View the results of these jobs in the Databases Discovered predefined report. A rough sketch of the scan-and-probe idea follows this card. |
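A rough sketch of the scan-and-probe idea, assuming TCP connect scanning and a small map of well-known default database ports; a real probe inspects the service itself rather than relying on port numbers.

```python
# Sketch: scan finds open TCP ports, probe maps open ports to likely database types.
import socket

DEFAULT_DB_PORTS = {1521: "Oracle", 1433: "MS SQL Server", 50000: "Db2",
                    5432: "PostgreSQL", 27017: "MongoDB"}

def scan(host, ports, timeout=0.5):
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

def probe(host, open_ports):
    # A real probe talks to the service; here we only map well-known default ports.
    return {port: DEFAULT_DB_PORTS.get(port, "Unknown") for port in open_ports}

if __name__ == "__main__":
    found = scan("127.0.0.1", DEFAULT_DB_PORTS.keys())
    print(probe("127.0.0.1", found))
```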
Steps to use the Auto-discovery application
|
1. Create an Auto-discovery process to search specific IP addresses or subnets for open ports.
2. Run the Auto-discovery process on demand or on a scheduled basis.
3. View the results of the process with Auto-discovery reports, or create custom reports. |
File Activity Monitoring functionality
|
File Activity Monitoring discovers the sensitive data on your servers; classifies content using predefined or user-defined definitions; and configures rules and policies about data access and the actions to be taken when rules are met.
|
Capabilities of FAM
|
• Discovery includes collecting metadata and entitlements for files and folders.
• Classification uses decision plans to identify potentially sensitive data in the files, such as credit card information or personally identifiable information.
• Monitoring and collection of audit information and policy rules, and real-time alerts or blocking of suspicious users or connections.
File activity monitoring:
• Meets regulatory compliance in a cost-effective way
  o Automate and centralize controls, provide an audit trail.
  o Achieve compliance with diverse regulations such as HIPAA, PCI DSS, and various state-level and national privacy regulations.
• Scales with growing data volumes and expanding enterprise requirements
• Provides extensive heterogeneous support across all popular systems |
Use case 1 of FAM
|
Critical application files can be accessed, modified, or even destroyed through back-end access to the application or database server.
Solution: File Activity Monitoring can discover and monitor your configuration files, log files, source code, and many other critical application files, and alert or block when unauthorized users or processes attempt access. |
Use case 2 of FAM
|
Need to protect files containing Personally Identifiable Information (PII) or proprietary information while not impacting day-to-day business.
Solution: File Activity Monitoring can discover and monitor access to your sensitive documents stored on many file systems. It will aggregate the data, give you a view into the activity, alert you in case of suspicious access, and allow you to block access to select files and folders and from select users. |
Use case 3 of FAM
|
Need to block back-end access to documents managed by your application.
Solution: File Activity Monitoring can discover, monitor, and block back-end access to your documents, which are normally accessed through an application front-end (for example, web portal). |
Databases Discovered Report functionality
|
The main entity for this report is the Discovered Port. Each individual port that is discovered has its own row in the report. The columns that are listed are: Time Probed, Server IP address, Server Host Name, DB Type, Port, Port Type (usually TCP), and a count of occurrences.
There are no special runtime parameters for this report, but it excludes any discovered ports with a database type of Unknown. When an auto-discovery process definition changes, the statistics for that process are reset. |
Auto-discovery Tracking Domain functionality
|
The Auto-discovery Tracking domain contains all of the data reported by Auto-discovery processes. Click any entity name to display its attributes.
Auto-discovery Tracking Domain Entities:
• Auto-discovery Scan provides a time stamp for each scan operation.
• Discovered Host provides the IP address and host name for each discovered host.
• Discovered Port provides a time stamp, identifies the port, and provides the database type for each port discovered open. |
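A sketch of the three tracking-domain entities as plain data records; the attribute names follow the descriptions above but are otherwise assumptions.

```python
# Sketch of the tracking-domain entities as simple data records.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AutoDiscoveryScan:
    scan_time: datetime            # time stamp for each scan operation

@dataclass
class DiscoveredHost:
    ip_address: str
    host_name: str

@dataclass
class DiscoveredPort:
    probe_time: datetime
    port: int
    db_type: str                   # database type found listening on the open port

row = DiscoveredPort(datetime(2024, 1, 15, 9, 0), 5432, "PostgreSQL")
print(row)
```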
Meaning of basic discovery
|
A basic discovery scan identifies the list of folders and files, their owner, access permissions, size, and the date and time of the last update. It also identifies user permissions and group permissions. Discovery supports all file types.
|
Classification, how is it defined?
|
Classification is defined by decision plans. Each decision plan contains rules for recognizing a certain type of data. (Decision plans for File Activity Monitoring are analogous to classification policies for Data Activity Monitoring.) Classification includes support for many types of files, including plain text, HTML, Office, and PDF. Default decision plans exist for HIPAA, PCI, SOX, and Source Code. You can change the classification entities from the resulting reports/investigation dashboard using the default decision plans. In addition, you can create new plans, or modify existing plans, using the Content Classifier Workbench, a Windows application that you upload to your collector appliance.
|