
Wednesday, February 27, 2013

NuoDB

NuoDB is the world’s first and only patented, elastically scalable SQL database built for decentralized computing resources. We started with a blank slate to design a brand new kind of database that has all of the features you want (like elastic scalability on the cloud with 100% ACID guarantees and SQL compliance) – and none of the things you don’t (like complex database administration tasks involving sharding, caching, clustering, and performance tuning).


The result is a brand new kind of database, optimized to run and scale on the cloud and built for the modern age.

Download the free version of NuoDB Pro today and see what the industry’s only high-performance, emergent database can do for your app!

  • 100% SQL compliance
  • 100% ACID
  • 100% free forever
  • Performance equal to or better than open source solutions like MySQL
  • Deploy in the cloud or on-premises
  • Supports Windows, MacOS, Linux, Solaris, and Joyent SmartOS
  • ODBC, JDBC, JRuby ActiveRecord, PHP/PDO, and Hibernate drivers
  • Simple, easy to use, zero admin

February 2013 - Happenings

Just a little update on the meet-up from last week. We hosted three sessions and had a strong turnout of 63 participants. Here are a few links you may find interesting:

Summary of the meet-up | Gallery | Slide decks and sample code downloads

Tuesday, February 26, 2013

Corrupted Log File

I received a call from one of my friends saying that he was unable to access his database, which is said to be one of the most critical databases in his organization.

When he tried to access the database, he got the following error message:

Database ‘ABCD’ cannot be opened due to inaccessible files or insufficient memory or disk space. See the SQL Server errorlog for details.

After asking a few questions, it turned out that his log drive had filled to 100% and the log had become corrupted. Since he didn’t have any recent backups, I was a little reluctant to delete the log file and attach the data file; I kept that option as my last resort.

So I asked him to run the following statement:

SELECT name,state,state_desc FROM sys.databases 
WHERE  name='ABCD' 

The output for state_desc was RECOVERY_PENDING.
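The result would look something like this (3 is the numeric state value behind RECOVERY_PENDING):

name    state   state_desc
ABCD    3       RECOVERY_PENDING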

I instructed him to set the database to emergency mode.

ALTER DATABASE ABCD SET EMERGENCY 

Then, I set the database to single-user mode, rolling back open transactions immediately:

ALTER DATABASE ABCD SET SINGLE_USER 
WITH ROLLBACK IMMEDIATE

Then I rebuilt the log (note that REPAIR_ALLOW_DATA_LOSS can, as the name implies, lose data) with:

DBCC CHECKDB('ABCD',REPAIR_ALLOW_DATA_LOSS)


Then I executed the initial statement again and verified that the database was ONLINE.


Finally, the following statement puts the database back into multi-user mode so that every user can log in:

ALTER DATABASE ABCD SET MULTI_USER

With this, users were able to log in to the database without any issues.

Monday, February 25, 2013

Ignore Duplicates in Oracle

When I wrote the post about ignoring duplicates in SQL Server, I got the following comment from one of the LinkedIn groups:

Also, Oracle has a great way to handle this in 10g and beyond.
By James Williams
@LinkedIn group SQL Server DBA

I thought I would look into this, with my very limited knowledge of Oracle.

I used Oracle Database 11g Enterprise Edition, Release 11.2.0.1.0.

First, let me create the table with a unique constraint.

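A minimal sketch of what that screenshot likely contained (the table and column names are my assumptions):

CREATE TABLE city (
  city_name VARCHAR2(100),
  CONSTRAINT uq_city_name UNIQUE (city_name)
);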

Let me insert three rows into the table.

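Something along these lines; Denver has to be one of them, since it collides with a later insert:

INSERT INTO city VALUES ('Denver');
INSERT INTO city VALUES ('Chicago');
INSERT INTO city VALUES ('Boston');
COMMIT;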

Now the question is how Oracle manages duplicate inserts against unique constraints.

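The error logging table is created with the DBMS_ERRLOG package; a minimal sketch, assuming the log table is named city_errors as in the text below:

BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name     => 'CITY',
                               err_log_table_name => 'CITY_ERRORS');
END;
/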

The above statement creates an error logging table using the DBMS_ERRLOG package. When an insert violates the unique constraint on CITY, the offending row can be seen in city_errors instead of failing the whole statement. If more than ten errors had occurred, the statement would have aborted, rolling back any insertions made.

Let’s say there is another table of data (tempCity) and we want to populate CITY from tempCity.

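A sketch of such a table, again with assumed names, where Denver duplicates an existing row:

CREATE TABLE tempCity (city_name VARCHAR2(100));

INSERT INTO tempCity VALUES ('Denver');   -- already exists in CITY
INSERT INTO tempCity VALUES ('Seattle');
INSERT INTO tempCity VALUES ('Austin');
COMMIT;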

If you simply insert tempCity into CITY, the entire statement will fail since the city Denver already exists.

However, you can insert using the following syntax.

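The DML error logging clause looks like this (the tag string inside the parentheses is arbitrary):

INSERT INTO city
SELECT * FROM tempCity
LOG ERRORS INTO city_errors ('INSERT_TEMPCITY') REJECT LIMIT 10;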

So you can see that only two rows are inserted while the duplicate row is ignored.

By analyzing the city_errors table, you will be able to see which records failed and the reason for each failure.

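A query along these lines returns each failed row together with the Oracle error; the Denver row should show ORA-00001 (unique constraint violated):

SELECT ora_err_number$, ora_err_mesg$, city_name
FROM   city_errors;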

Saturday, February 23, 2013

Ignore Duplicates

Today I was asked about a situation where an entire data insertion was failing due to a violation of a unique index. In this scenario, they actually need to ignore the duplicate data and continue with the other inserts.

CREATE TABLE Name (ID INT IDENTITY PRIMARY KEY CLUSTERED,
FirstName VARCHAR(100),
LastName VARCHAR(100))
GO

CREATE UNIQUE NONCLUSTERED INDEX [IX_Name_FirstName_LastName]
ON [dbo].Name
(
[FirstName] ASC,
[LastName] ASC
) ON [PRIMARY]
GO

So the above script will create a table and a unique index on that table.


Let’s insert two rows into this table.

INSERT INTO Name
VALUES
('John','Samuels'),
('Steve','John')


Since the above two records do not violate the unique constraint, you will not find any difficulties with the above query, and you will have the following records in the table.


ID    FirstName    LastName
1     John         Samuels
2     Steve        John


Then we will insert this set of rows. The ('John','Samuels') row is the one that will violate the unique constraint.

INSERT INTO Name
VALUES
('Steve','Smith'),
('John','Samuels'),
('Steve','Rodes'),
('Phill','Samuels')

As expected, this will fail with the following error:


Msg 2601, Level 14, State 1, Line 1
Cannot insert duplicate key row in object 'dbo.Name' with unique index 'IX_Name_FirstName_LastName'.
The duplicate key value is (John, Samuels).
The statement has been terminated.


If you query the table, you will see that not only the duplicate row but also the other rows are absent from the table; the entire statement was rolled back.


So the solution is to use the IGNORE_DUP_KEY option. This is how you create the index with the option enabled; by default this option is OFF.

CREATE UNIQUE NONCLUSTERED INDEX [IX_Name_FirstName_LastName]
ON [dbo].Name
(
[FirstName] ASC,
[LastName] ASC
) -- DROP_EXISTING = ON replaces the index created earlier in one step
WITH (IGNORE_DUP_KEY = ON, DROP_EXISTING = ON) ON [PRIMARY]
GO

When you insert the same data set again, no errors will be generated, only this message:


Duplicate key was ignored.


(3 row(s) affected)


So, the duplicate record was ignored and no errors were generated.

Saturday, February 16, 2013

Temp Tables Vs Table Variables

I have posted a couple of blog posts on the above topic which are very popular.

Those blog posts were:

Temp Tables Vs Table Variables Vs CTE

What's the difference between a temp table and table variable in SQL Server?

Today I came across a nice picture from http://sqlserverplanet.com which explains everything:

[Image: Temp Tables vs Table Variables comparison chart from sqlserverplanet.com]

Isn’t this very simple and easy to understand?
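As a quick reminder of the two objects the chart compares, here is a minimal T-SQL sketch (the names are mine, purely for illustration):

-- Table variable: batch-scoped, no statistics maintained
DECLARE @Names TABLE (ID INT, Name VARCHAR(100));
INSERT INTO @Names VALUES (1, 'John');

-- Temp table: session-scoped (lives in tempdb), supports statistics and extra indexes
CREATE TABLE #Names (ID INT, Name VARCHAR(100));
INSERT INTO #Names VALUES (1, 'John');
DROP TABLE #Names;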

Sunday, February 10, 2013

Is SET ROWCOUNT a Deprecated Feature?

I have read some blog posts saying that the SET ROWCOUNT feature is deprecated in SQL Server 2012.

This is a half-truth.

SET ROWCOUNT is deprecated for INSERT, UPDATE, and DELETE statements, but not for SELECT.

If you want to limit the rows affected by INSERT / UPDATE / DELETE, you need to use the TOP keyword instead, as shown below.
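For example, a minimal sketch of the TOP form for DML (dbo.StagingRows and its Processed column are hypothetical names, purely for illustration):

-- Deprecated pattern: SET ROWCOUNT 100 followed by DELETE/UPDATE
-- Supported pattern: put the limit in the statement itself
DELETE TOP (100) FROM dbo.StagingRows WHERE Processed = 1;
UPDATE TOP (100) dbo.StagingRows SET Processed = 1 WHERE Processed = 0;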

ROWCOUNT & TOP

I just came across this scenario: what if you are using both ROWCOUNT and TOP together in one T-SQL statement?

DECLARE @rowcnt int = 100
SET ROWCOUNT @rowcnt

SELECT TOP 105 * FROM Sales.SalesOrderHeader


This will return 100 rows, and you could easily conclude that ROWCOUNT overrides TOP. Well, let us examine another scenario.

DECLARE @rowcnt int = 100
SET ROWCOUNT @rowcnt

SELECT TOP 5 * FROM Sales.SalesOrderHeader

This will return only 5 rows!


So the rule is,


SET ROWCOUNT overrides the SELECT statement’s TOP keyword if the ROWCOUNT value is smaller; in effect, the smaller of the two limits wins.
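One thing to remember when testing this: the ROWCOUNT setting stays in effect for the session until you reset it, so experiments like the ones above should end with:

SET ROWCOUNT 0  -- 0 removes the limit; subsequent statements return all rows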

Saturday, February 2, 2013

SQL Server Disaster Recovery poster

You need to be registered with SQL Server Magazine to download the poster, written by Paul Randal, as a free PDF, and you will also be sent one in the mail. The poster has links to a page with more explanations on the author’s website.

Check it out at http://www.sqlmag.com/whitepaper/sql-server/sql-server-disaster-recovery-step-step-145134
