SQL - safely downcast BIGINT to INT
Add these to your script:
SET ARITHABORT OFF;
SET ARITHIGNORE ON;
With these settings, any value that overflows during the conversion is returned as NULL instead of raising an error. Note that ARITHIGNORE has no effect while ANSI_WARNINGS is ON, so you may need to turn that off as well.
More info here: http://msdn.microsoft.com/en-us/library/ms184341.aspx
Change all SQL Server Columns From BigInt to Int
Well, I have come across this kind of problem before, when I had to change int
to bigint
. Going the other way is harder, but possible. Changing the data type itself is easy with the following statement:
Alter table myTable alter column targetcolumn int not null
However, if your columns are involved in a constraint relationship, you have to drop the constraints, alter the columns, and then recreate the constraints:
Alter table myTable drop constraint [fkconstraintname]
Alter table myTable alter column targetcolumn int not null
Alter table othertable alter column targetcolumn int not null
Alter table myTable add constraint [fkconstraintname] foreign key (targetcolumn) references othertable(targetcolumn)
EDIT
If you have a lot of constraints, changing them by hand is a real pain. If many tables have constraints and there is no pressing need to change, don't do it.
EDIT
Then you can do the following. Connect to SQL Server via Management Studio, right-click the database => Tasks => Generate scripts.
Next => Next
At that point press Advanced. In the popup, set Types of data to script
to Schema and data. Choose whichever output is comfortable for you (file, query window), press OK, and proceed. It will produce a complete DDL and DML script, like this:
USE [master]
GO
/****** Object: Database [Zafarga] Script Date: 02/02/2012 19:31:55 ******/
CREATE DATABASE [Zafarga] ON PRIMARY
GO
ALTER DATABASE [Zafarga] SET COMPATIBILITY_LEVEL = 100
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [Zafarga].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO
ALTER DATABASE [Zafarga] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [Zafarga] SET ANSI_NULLS OFF
GO
ALTER DATABASE [Zafarga] SET ANSI_PADDING OFF
GO
ALTER DATABASE [Zafarga] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [Zafarga] SET ARITHABORT OFF
GO
ALTER DATABASE [Zafarga] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [Zafarga] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [Zafarga] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [Zafarga] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [Zafarga] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [Zafarga] SET CURSOR_DEFAULT GLOBAL
GO
ALTER DATABASE [Zafarga] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [Zafarga] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [Zafarga] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [Zafarga] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [Zafarga] SET ENABLE_BROKER
GO
ALTER DATABASE [Zafarga] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [Zafarga] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [Zafarga] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [Zafarga] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [Zafarga] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [Zafarga] SET READ_COMMITTED_SNAPSHOT OFF
GO
ALTER DATABASE [Zafarga] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [Zafarga] SET READ_WRITE
GO
ALTER DATABASE [Zafarga] SET RECOVERY FULL
GO
ALTER DATABASE [Zafarga] SET MULTI_USER
GO
ALTER DATABASE [Zafarga] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [Zafarga] SET DB_CHAINING OFF
GO
EXEC sys.sp_db_vardecimal_storage_format N'Zafarga', N'ON'
GO
USE [Zafarga]
GO
/****** Object: Table [dbo].[Category] Script Date: 02/02/2012 19:31:56 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Category](
[CategoryId] [bigint] IDENTITY(1,1) NOT NULL,
[CategoryName] [nvarchar](max) NULL,
PRIMARY KEY CLUSTERED
(
[CategoryId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
/****** Object: Table [dbo].[Product] Script Date: 02/02/2012 19:31:56 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Product](
[ProductId] [bigint] IDENTITY(1,1) NOT NULL,
[Name] [nvarchar](max) NULL,
[Price] [decimal](18, 2) NOT NULL,
[CategoryId] [bigint] NOT NULL,
PRIMARY KEY CLUSTERED
(
[ProductId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
/****** Object: ForeignKey [Category_Products] Script Date: 02/02/2012 19:31:56 ******/
ALTER TABLE [dbo].[Product] WITH CHECK ADD CONSTRAINT [Category_Products] FOREIGN KEY([CategoryId])
REFERENCES [dbo].[Category] ([CategoryId])
ON DELETE CASCADE
GO
ALTER TABLE [dbo].[Product] CHECK CONSTRAINT [Category_Products]
GO
Change all the data types appropriately, then run the script.
Since, as you said, all your data is below 5000 rows, there is no need to modify the insert statements.
Be prepared: it will take a long time.
Hope this was useful.
EDIT
This generates a new database, so be ready to rename either the original or the newly created one.
SQL split bigint into two int
SQL Server is treating and displaying your bytes as signed integers. The data in binary is still correct.
About half of the time, the bytes holding the least significant half of your number will fall in the negative range of a four-byte signed int. The same happens when negative values are split: sometimes the lower half comes out positive.
You can think of it this way: after splitting into two values you now have two sign bits (the leading bit of each half) rather than the single one you started with.
This might work to get back the unsigned value you expected to see:
cast(0x00000000 + cast(@RightHalf as binary(4)) as bigint)
It might be even easier to do the following, which keeps everything in the bigint type:
SET @LeftHalf = @BigintDataLimitIn / 4294967296
SET @RightHalf = @BigintDataLimitIn % 4294967296
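The same division/modulo split, and the reason the low half can display as negative, can be sketched in Python (function names are mine, and the example value is arbitrary):

```python
def split_bigint(value: int) -> tuple[int, int]:
    """Split a 64-bit value into its high (quotient) and low (remainder) 32-bit halves."""
    left = value // 2**32
    right = value % 2**32  # always in 0 .. 2**32 - 1
    return left, right

def as_signed_int32(unsigned: int) -> int:
    """Reinterpret an unsigned 32-bit value the way a signed int column displays it."""
    return unsigned - 2**32 if unsigned >= 2**31 else unsigned

left, right = split_bigint(7_000_000_000)
# left == 1 and right == 2_705_032_704; because right's leading bit is set,
# a signed int column would display it as -1_589_934_592.
# Reconstruction is lossless: left * 2**32 + right == 7_000_000_000
```

Note that integer division in T-SQL truncates toward zero for negative values, so for negative bigints the SQL expressions above behave slightly differently from this floor-division sketch.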
Error converting data type bigint to int [SQL Server]
Use varchar(10) or char(10) instead of bigint.
Casting BigInt to Int in Spark
Casting is like changing "the glasses" your code uses to look at what your value references; it does not change the referenced content, nor does it repoint the reference at a new BigInt
instance.
That implies that you need to get your value with the type it really has and then build a BigInt
instance from it:
BigInt(row.getAs[Long](0))
Following the same reasoning, you can create an Int
instance from the Long
as follows:
row.getAs[Long](0).toInt
But it might overflow the integer type representation range.
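For intuition, here is a Python sketch (the function name is mine) of what JVM narrowing from Long to Int does when the value does not fit: it keeps only the low 32 bits, interpreted as two's complement:

```python
def long_to_int(value: int) -> int:
    """Mimic JVM narrowing (Long -> Int): keep the low 32 bits, two's-complement signed."""
    low = value & 0xFFFFFFFF
    return low - 0x1_0000_0000 if low >= 0x8000_0000 else low

# Small values survive intact; 2**31 is one past Int.MaxValue and wraps
# around to Int.MinValue rather than raising an error.
```

This silent wraparound is why a range check (or an exact-conversion helper) is preferable to a bare toInt when the Long might exceed the Int range.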
Is it wise to convert from BigInteger to int?
It is definitely not safe to convert a BigInteger to an int, as you would be vulnerable to overflow.
To get around this, you have to add a condition that ensures you never convert a BigInteger lying outside the int range, which is:
Integer.MAX_VALUE = 2147483647
Integer.MIN_VALUE = -2147483648
You could consider whether long is sufficient for your use, though. It has significantly wider bounds:
Long.MAX_VALUE = 9223372036854775807
Long.MIN_VALUE = -9223372036854775808
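The range guard described above can be sketched as follows (a Python stand-in; in Java itself, BigInteger.intValueExact() on Java 8 and later performs the same check and throws ArithmeticException on overflow):

```python
INT_MIN, INT_MAX = -2**31, 2**31 - 1  # Integer.MIN_VALUE / Integer.MAX_VALUE

def to_int_exact(value: int) -> int:
    """Narrow only when the value fits in a signed 32-bit int; otherwise raise."""
    if not (INT_MIN <= value <= INT_MAX):
        raise OverflowError(f"{value} does not fit in a 32-bit int")
    return value
```

The same pattern works for a long target: just swap in the Long bounds listed above.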
CAST and IsNumeric
IsNumeric returns 1 if the varchar value can be converted to ANY number type. This includes int, bigint, decimal, numeric, real & float.
Scientific notation could be causing you a problem. For example:
Declare @Temp Table(Data VarChar(20))
Insert Into @Temp Values(NULL)
Insert Into @Temp Values('1')
Insert Into @Temp Values('1e4')
Insert Into @Temp Values('Not a number')
Select Cast(Data as bigint)
From @Temp
Where IsNumeric(Data) = 1 And Data Is Not NULL
There is a trick you can use with IsNumeric so that it returns 0 for numbers with scientific notation. You can apply a similar trick to prevent decimal values.
IsNumeric(YourColumn + 'e0')
IsNumeric(YourColumn + '.0e0')
Try it out.
SELECT CAST(myVarcharColumn AS bigint)
FROM myTable
WHERE IsNumeric(myVarcharColumn + '.0e0') = 1 AND myVarcharColumn IS NOT NULL
GROUP BY myVarcharColumn
max value represented by bigint
See the answer provided in this similar question. There is no way, as far as I know, to programmatically find the answer you're looking for.
Based on the comments you posted on another answer, this would allow you to only have to change your values in one place, as opposed to multiple places.
Converting a FLOAT to an INT
Rather CAST
or CONVERT
to BIGINT
, as your number is too large for int. See int, bigint, smallint, and tinyint:
bigint
Integer (whole number) data from -2^63 (-9,223,372,036,854,775,808) through 2^63 - 1 (9,223,372,036,854,775,807). Storage size is 8 bytes.
int
Integer (whole number) data from -2^31 (-2,147,483,648) through 2^31 - 1 (2,147,483,647). Storage size is 4 bytes. The SQL-92 synonym for int is integer.
Cast bigint to long
If this is MySQL, you should probably use java.math.BigDecimal.
See the table at Java, JDBC and MySQL Types.