
Top 10 Experts Tips for DBA to Make Database Secure and Faster

Kapil Panchal - September 11, 2020


When we advise someone about security, we usually recommend techniques such as keeping backups of their data, maintaining strong and complex passwords, avoiding default settings, using password-protected applications, and encrypting data in transit. However, these are just a few drops in the ocean of security. It is good practice to take as many steps as possible to strengthen security, especially for businesses and custom software development companies, because they hold crucial and sensitive data. Enterprises are also advised to keep the database server on a separate machine and to monitor it continuously, with tools such as firewalls and antivirus software standing as a barrier against unauthorized attacks.

A Capable and Reliable Database Will Always Work


“We've always run Postgres databases for our primary data store and have found them very capable and reliable. I've worked extensively on every level of our application stack and the database is my favorite part. We're able to achieve great performance through extensive application- and schema-level tuning.

Speed: We have spent a ton of time making our queries faster. Our general approach is to profile a query using the EXPLAIN ANALYZE keywords to see how Postgres is making use of the existing indexes and how much work it needs to do for each step. Once we've identified the slow section, we can experiment with building different single- or multi-column indexes and rerunning the query. For columns with low cardinality, like a Boolean, we've found B-Tree indexes don't help much, and often Postgres will just ignore them. This can usually be solved with a multicolumn index or by rewriting the query so that the low-cardinality columns are filtered out as late as possible, after most of the rows have been eliminated by a more efficient index. Postgres also supports other index types like Hash, BRIN, GiST, and GIN that can work well when B-Trees aren't giving you the speed you need.
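To make this concrete, here is a minimal sketch of that profiling loop in Postgres; the sales table, its columns, and the index name are purely illustrative:

```sql
-- Hypothetical example: profile a query to see the plan and actual timings
EXPLAIN ANALYZE
SELECT *
FROM sales
WHERE brand_id = 42
  AND is_active = true;

-- A B-Tree index on the boolean alone rarely helps; a multicolumn index
-- lets the selective column (brand_id) narrow the rows first
CREATE INDEX idx_sales_brand_active ON sales (brand_id, is_active);
```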

It's also important to consider your application's data access pattern because just looking at query run times can be misleading. If you're making a lot of small queries the network overhead can result in worse overall performance than making one large query. For example, we may want to pull some data for every car model of a brand. Limiting our queries to just one model at a time results in very fast queries, but ends up being slower overall because of network latency. The query over the entire brand could have been many times slower, but because most brands sell a lot of different car models the overall result was much faster.
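A hypothetical sketch of the two access patterns described above (the sales and models tables are illustrative):

```sql
-- Pattern 1: one network round trip per model (fast queries, slow overall)
SELECT * FROM sales WHERE model_id = 101;
SELECT * FROM sales WHERE model_id = 102;
-- ...repeated for every model the brand sells

-- Pattern 2: a single round trip covering the whole brand
SELECT s.*
FROM sales s
JOIN models m ON m.id = s.model_id
WHERE m.brand_id = 42;
```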

Security: In addition to the normal advice of locking down access as much as possible with IP/network restrictions and limiting user/role access to just what is needed, we also recommend strict use of parameterized queries any time a variable is used in a query, to prevent SQL injection. When we started building our application, data security was one of our top priorities, so we built our SQL interface to be as secure as possible by default. In our case, every SQL query made by our application through every level of the stack uses only parameterized queries. Because our application is always changing, it's impossible to be sure that a given input will stay safe forever, so it's better to parameterize everything and avoid SQL injection risks altogether. You also get a more robust system that can handle "Wally's Autoland" having a sale.
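At the SQL level, the same idea can be sketched with a server-side prepared statement in Postgres; client libraries expose equivalent parameter placeholders. The dealers table and query here are hypothetical:

```sql
-- Server-side prepared statement: the value travels as data, not as SQL text
PREPARE find_dealer (text) AS
    SELECT * FROM dealers WHERE name = $1;

-- "Wally's Autoland" cannot terminate the string and inject SQL here
EXECUTE find_dealer('Wally''s Autoland');
```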

Robust/Maintainable/Relational: These are closely related for us. We make use of a lot of foreign keys and type definitions to keep data in sync. One place we could have done better is in naming some of our many-to-one relationships. We have one table of ("objectID", "mappedID") where the mapping direction isn't immediately obvious. We don't work with this table very often, so whenever it comes up we have to stop and look up which ID is mapped to the other. Better column names would have been ("originID", "targetID").
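A minimal sketch of what such a mapping table could look like with the clearer names and enforced foreign keys (the table and column names are illustrative, not the author's actual schema):

```sql
-- Direction-revealing names plus enforced foreign keys on a mapping table
CREATE TABLE model_mappings (
    origin_id BIGINT NOT NULL REFERENCES models (id),
    target_id BIGINT NOT NULL REFERENCES models (id),
    PRIMARY KEY (origin_id, target_id)
);
```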

We try to normalize data as much as possible, but in some cases it's just easier to write queries if you denormalize a little. Denormalizing might create some problems with data desyncing, so we recommend trying to find a balance that works for you and your application. We're able to keep things synchronized with some extra checks when updating and the outcome of making queries easier to write and understand is well worth it.
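For illustration, a tiny hypothetical example of "denormalizing a little" and the extra synchronization it implies (the tables and columns are made up):

```sql
-- Denormalized copy: orders carries the customer name to simplify queries
ALTER TABLE orders ADD COLUMN customer_name text;

-- The price of denormalizing: every update to the source needs a matching
-- update to the copy, or the two drift apart
UPDATE customers SET name = 'New Name' WHERE id = 7;
UPDATE orders    SET customer_name = 'New Name' WHERE customer_id = 7;
```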

As your row counts grow you may need to revisit some database-wide maintenance settings. In Postgres you need to perform routine vacuuming to update table statistics and get rid of dead rows. Failure to vacuum can degrade performance significantly. By default, it happens after a specified fraction of the table's rows are updated or deleted, but with large tables that fraction could be tens of millions of rows. We recommend revisiting your maintenance settings periodically to make sure they still make sense with how you're using the database. For Postgres specifically, you should also monitor the autovacuum process to make sure that it's able to keep up with the number and size of tables in your database. With high write workloads it's easy to get into a situation where your vacuums can't keep up and performance suffers. In that case you'd need to increase the number of autovacuum processes or give each process more resources.”

- William Young, Founder of Competitive Intelligence Solutions LLC.
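The autovacuum tuning mentioned above maps to a handful of Postgres settings, both per table and cluster-wide. A rough sketch, with an illustrative table name and values that are examples rather than recommendations:

```sql
-- Trigger vacuum on this table after ~1% of rows change instead of the
-- default 20%, which can mean tens of millions of rows on a large table
ALTER TABLE sales SET (
    autovacuum_vacuum_scale_factor = 0.01,
    autovacuum_vacuum_threshold    = 1000
);

-- Allow more autovacuum workers cluster-wide (takes effect after a restart)
ALTER SYSTEM SET autovacuum_max_workers = 6;
```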

Use Secure Connections Like SSL


“To ensure the security of your database, it’s advisable to use secure connections like SSL to protect communication between a client and a server. Additional security comes from setting up database login restrictions and permissions and from keeping anti-virus software and firewalls running. Database performance and speed can be improved with a well-thought-out indexing strategy, which is often overlooked. Indexes not only structure the data and enable faster interaction with it, but also build clusters of rows, allowing you to tune your database using those data clusters. No less important is creating backups and reserving standby servers for the database's maintainability. This helps the business protect data in case of loss and quickly recover the system after critical errors.”

- Maxim Ivanov, CEO of Aimprosoft
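On the Postgres side, the server-facing pieces of this advice can be sketched as follows; the table and index names are hypothetical, and clients would additionally connect with sslmode=require or stricter:

```sql
-- Turn on TLS server-side (takes effect on configuration reload)
ALTER SYSTEM SET ssl = on;

-- Physically reorder a table along an index to exploit row clustering
CLUSTER customers USING idx_customers_region;
```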

Set Smart Indexing and Limit the Data Returned by Queries


“Here are my top 10 tips:

  1. Limit the number of accounts that access your system (secure)
  2. Limit the amount of information returned by queries (secure, fast)
  3. Ensure all accounts are traceable to an owner (secure)
  4. Utilize directory services (Active Directory, Oracle Internet Directory, etc.) for account authentication. They have stronger security protocols than a typical username and password. (secure)
  5. Enforce naming conventions and standards (maintainable)
  6. Take regular backups. Use encryption and compression tools available with your database software. (robust, secure)
  7. Use indexes smartly. Indexes speed up your SELECT statements but slow down updates if there are too many. (fast)
  8. Keep statistics up-to-date! (fast)
  9. Don't wait for a disaster to happen. Have a disaster recovery procedure ready to cover these 2 basic scenarios: loss of infrastructure and corruption. (robust)
  10. Establish baseline performance metrics. Continuously collect data about the database, the underlying infrastructure (server, operating system, storage subsystem, network), and the transactions that run against it. You need to know what your numbers are on a good day so you can proactively identify anomalies or spot a negative trend and take action. (robust, maintainable)”

- Fred Blair, Founder of AwesomeHoops.com
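Tips 7 and 8 in the list above translate almost directly into SQL. A minimal Postgres-flavored sketch, with hypothetical table and column names:

```sql
-- Tip 7: index the columns your WHERE clauses actually filter on
CREATE INDEX idx_orders_customer ON orders (customer_id);

-- Tip 8: refresh planner statistics so the optimizer keeps picking good plans
ANALYZE orders;
```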

Practice Query Optimization and Encryption, and Limit Database Access


“HERE ARE THREE TIPS TO MAKE A DATABASE SECURE, ROBUST, FAST, AND RELATIONAL:

CONTROL ACCESS TO THE DATABASE. When too many people play around with the database, the result is rarely a positive one. You need to limit the permissions and privileges on your database in order to make it more secure. Lock down system permissions so that an attacker cannot reach the information stored in the database, and allow only certain users to run queries involving sensitive information. Avoid accessing the database from outside the office network. It is also a good idea to disable any services that are not in use so an attacker cannot exploit them.
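A minimal sketch of this kind of lockdown in Postgres terms (the database, role, and table names are hypothetical):

```sql
-- Start from zero access, then grant each role only what it needs
REVOKE ALL ON DATABASE appdb FROM PUBLIC;

CREATE ROLE reporting LOGIN PASSWORD 'change-me';
GRANT CONNECT ON DATABASE appdb TO reporting;
GRANT SELECT ON public.orders TO reporting;  -- read-only, one table
```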

FOR BETTER PERFORMANCE, OPTIMIZE QUERIES. Performance issues usually occur when a poorly written SQL query is used. It is sometimes difficult to decide whether to use IN or EXISTS, or which JOIN should be used. You can speed up your SQL queries by using query optimizers such as EverSQL Query Optimizer. Also, consider a stronger CPU: the better your CPU, the faster and more efficient your database will be.
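As a small illustration of the IN-versus-EXISTS choice (the tables are hypothetical, and modern optimizers often plan both forms similarly, so always verify with your own profiling):

```sql
-- IN: the subquery result is computed and compared against
SELECT * FROM customers
WHERE id IN (SELECT customer_id FROM orders WHERE total > 1000);

-- EXISTS: can stop at the first matching row per customer, which often
-- wins when the inner table is large but well indexed
SELECT * FROM customers c
WHERE EXISTS (
    SELECT 1 FROM orders o
    WHERE o.customer_id = c.id AND o.total > 1000
);
```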

ENCRYPT THE INFORMATION PRESENT IN THE DATABASE TO MAKE IT MORE SECURE. First, find out which data is critical and requires protection. Understand the architecture of your database in order to know how and where your critical information is stored. If an attacker gains access to the system, they will try to pull information from the database and will go straight for the critical data. So, once you have identified the sensitive data, use robust algorithms to encrypt it.”

- Azza Shahid, Outreach Consultant at Heart Water
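One way to sketch the encryption step is with Postgres's pgcrypto extension. The table, column, and key below are purely illustrative, and in practice the key should come from a secrets manager, not a SQL literal:

```sql
-- Column-level encryption with the pgcrypto extension
CREATE EXTENSION IF NOT EXISTS pgcrypto;

-- Encrypt on the way in (ssn_encrypted is a bytea column)...
INSERT INTO patients (name, ssn_encrypted)
VALUES ('Jane Doe', pgp_sym_encrypt('123-45-6789', 'a-strong-key'));

-- ...and decrypt only where the application genuinely needs the plaintext
SELECT name, pgp_sym_decrypt(ssn_encrypted, 'a-strong-key') AS ssn
FROM patients;
```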


Decide on DB Features Early and Utilize DBaaS Services


“The truth is, you are almost never going to find the perfect database for your application. As you grow and scale, there will always be compromises that need to be made or costs that need to be absorbed. But a lot of pain can be avoided by deciding what the most important database features are at the beginning of your database design process and selecting a database type and platform that will continue to meet your needs as your application grows.

Another factor to consider is that you probably won't find a one-size-fits-all database, and that some features are best left to specialist services. For example, we decided to use Algolia to handle the search feature on Wethrift.com. While our primary database doesn't handle search very well, Algolia specializes in search alone. Adding a specialist database-as-a-service to handle this one task saved us countless hours that would have been spent trying to make search work on our primary database.”

- Nick Drewe, Founder of Wethrift.com

 

Decide Proper Key Constraints and Data Modeling


“Here are my tips for DBAs based on first-hand experience:

Invest time in careful planning before starting to code. Analyze the nature of the data and what it’s supposed to do. Avoid doing software development and data modeling at the same time. For example, if the requirement is to create an analytic report, you don’t want to design for transaction processing.

Decide on primary keys, foreign keys, and artificial keys. Your challenge is to figure out which column you should assign as the primary key and what values you should include. For example, some data modelers use an artificial key because of its guaranteed uniqueness, but a natural key helps ensure data integrity. The challenge comes when the natural key comprises many columns: if it has to be amended, you will need to make changes in many places. On the other hand, if you define a table such as a product definition with only an artificial key (a sequence) next to a short product code, you may end up with two rows with the same product code.

Make the most of the architecture with data modeling and get a clear picture of data relationships. Data modeling will help you make sure that you maximize business outcomes by using various components and don’t miss out on any of your data assets. Data modeling will make your team’s work easier because they will resolve issues faster by identifying weak links.”

- Thierry Tremblay, CEO & Founder of Kohezion
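A common way to get the best of both kinds of key, sketched in Postgres syntax with a hypothetical products table: a surrogate primary key for stable joins, plus a UNIQUE constraint on the natural key so duplicate product codes become impossible:

```sql
-- Surrogate primary key for stable joins, plus a UNIQUE constraint on the
-- natural key so two rows can never share the same product code
CREATE TABLE products (
    id           BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    product_code TEXT NOT NULL UNIQUE,
    name         TEXT NOT NULL
);
```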

Lock Accounts After Failed Login Attempts


It is good practice for any organization to keep user access to its database server limited. Unrestricted access to the database can undermine security, since it may hold sensitive data and significant project information. Limiting database access to authorized users therefore makes a lot of sense, and one simple enforcement mechanism is locking an account after three or four failed login attempts.
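Some databases support this natively. In Oracle, for instance, a profile can enforce the lockout (the profile name, user, and limits below are illustrative); Postgres has no built-in equivalent, so the same policy is usually enforced at the authentication layer:

```sql
-- Oracle example: lock an account for a day after four failed logins
CREATE PROFILE secure_logins LIMIT
    FAILED_LOGIN_ATTEMPTS 4
    PASSWORD_LOCK_TIME    1;

ALTER USER app_user PROFILE secure_logins;
```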

Monitor User Activity in the Database


This is another building block of sound security maintenance in an organization. Regularly reviewing the database logs for login attempts helps catch breaches early, and monitoring and auditing processes let you spot a user performing suspicious activities with the data so you can take action. Beyond that, there are tools such as DAM (Database Activity Monitoring) that can help administrators audit and monitor these actions.
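As a sketch of what this looks like in Postgres, a few logging settings make connections and data changes auditable:

```sql
-- Make connections and data changes visible in the logs
ALTER SYSTEM SET log_connections = on;
ALTER SYSTEM SET log_disconnections = on;
ALTER SYSTEM SET log_statement = 'mod';  -- DDL plus INSERT/UPDATE/DELETE
SELECT pg_reload_conf();
```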

Use Separate Servers


An organization building a sustainable and robust website will choose the best hosting server for it, but it is further suggested to keep the company’s data storage on separate servers with stronger security controls. Beyond that, it is even better to require a couple of extra validations when accessing the database, and to make sure this separate server is itself protected with the necessary security measures.

Database Segmentation


A wide-open database has a far greater chance of being compromised. Segmenting the database according to roles is therefore a sound way to secure the data. For example, a developer can read the basic information in a table but cannot alter it without permission from the team lead, while the team lead can access sensitive data and alter the structure of the database as needed.
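Sketched in SQL terms with hypothetical role names, such segmentation might look like:

```sql
-- Developers can read; team leads can also change data
CREATE ROLE developer;
CREATE ROLE team_lead;

GRANT SELECT ON ALL TABLES IN SCHEMA public TO developer;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO team_lead;
```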

Infographics


[Infographic: Top 10 Experts Tips for DBA to Make Database Secure and Faster]
Kapil Panchal

A passionate Technical writer and an SEO freak working as a Content Development Manager at iFour Technolab, USA. With extensive experience in IT, Services, and Product sectors, I relish writing about technology and love sharing exceptional insights on various platforms. I believe in constant learning and am passionate about being better every day.
