Looking for documentation of R integration with IQ
My google-fu is on the fritz today. Anyone have links to documentation of R integration with IQ?
Sybase IQ license check
Hi Team,
Is there a stored procedure, system view, or system table where we can get IQ license information, such as the validity of the license?
I am using unserved licensing mode.
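In case it helps, one possible starting point (an assumption on my part; availability may depend on version and licensing mode) is the SySAM license management procedure, which reports the current license configuration when called without arguments:
sp_iqlmconfig;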
Thanks,
Razal
Sybase IQ Point in time recovery backup
Hi,
I was trying to set up Sybase IQ point-in-time recovery log backup as per the Admin Guide for SP08.
As per the guide, we should enable it as follows:
1. SET OPTION PUBLIC.IQ_POINT_IN_TIME_RECOVERY_LOGGING = 'ON'
2. ALTER DBSPACE IQ_SYSTEM_LOG RENAME '/My Directory/<prefix> '
3. BACKUP DATABASE FULL to '/demo/dataBackup/FULL1'
After the 2nd step, the DB crashes with the error below:
-----------------
Error! The connection to the database was closed by the server.
Communication error
SQLCODE=-85, ODBC 3 State="08S01"
Line 1, column 1
ALTER DBSPACE IQ_SYSTEM_LOG RENAME My Directory/<prefix>
--------------------
I also tried creating IQ_SYSTEM_LOG with the statement below, but it fails:
"CREATE DBSPACE IQ_SYSTEM_LOG AS ' My Directory/<prefix>"
Could not execute statement.
Item 'IQ_SYSTEM_LOG' already exists
SQLCODE=-110, ODBC 3 State="42S01"
Line 1, column 1
CREATE DBSPACE IQ_SYSTEM_LOG AS '/ My Directory/<prefix>'
----------------------------------------
I tried the following SQL to check the current dbspaces, but IQ_SYSTEM_LOG is not listed:
select * from SYSDBSPACE
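As a side note, the IQ dbspaces can also be listed with the stored procedure below (a hedged suggestion; I am not sure whether it reports IQ_SYSTEM_LOG in this release):
sp_iqdbspace;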
-----------------------
Can someone please help me here and share best practices, so that I can enable automatic backups?
Thanks,
Razal
Trouble with iqunload and identity columns
Hello,
We are in the process of upgrading from IQ 12.7 to IQ 15.4 (GA) using the iqunload 15.4.0.6567 (GA) utility.
When running iqunload in the reload-schema phase, we get the following error:
The database reload failed with the following error:
***** SQL error: IQ Internal error. Please report this to Sybase IQ support.
-- (db_table.cxx 2446)
This file contains the statement that caused the schema reload failure.
To complete the schema reload, you need to modify your database to avoid generating the statement below.
Once you have modified your database appropriately, re-run the schema reload process.
NOTE: You may want to generate the schema only for the database being reloaded,
and load this schema into an empty database to check for sql errors:
SET OPTION "CORP_MSTR_DES_VIS"."identity_insert"='CORP_MSTR_DES_VIS.DIM_ITEM_CLASS'
Q: Should we switch off the identity_insert option for PUBLIC before running the command?
We tried altering the table and setting the default to null for that column, but it failed.
Anyway, the table looks like this:
CREATE TABLE "CORP_MSTR_DES_VIS"."DIM_ITEM_CLASS" (
"ITEM_CLASS_CD" varchar(255) NOT NULL
,"ITEM_CLASS_DESC" varchar(50) NOT NULL
,"CRM_ITEM_CLASS_ID" unsigned int NOT NULL DEFAULT autoincrement
Thank you
Regards
JMT
SA CR 728597 / Linux Kernel direct i/o bug & huge pages
Last year, April -> October, I asked about IQ supporting huge pages on Linux. It was mentioned, under SA CR 728597 and Red Hat Bug 891857, that there is a bug in the Linux kernel's handling of direct I/O while transparent huge pages (a variant of Linux huge memory pages) are in use.
CR 728597:
This problem is related to a possible bug in the transparent huge pages (THP) feature introduced in these operating system versions. Red Hat bug 891857 has been created to track this issue.
The problem can be triggered by calling an external environment, xp_cmdshell, or other procedure that causes a fork while other I/O is occurring. A known limitation with the Linux kernel limits the use of fork while doing O_DIRECT I/O operations. Essentially what can happen is that the data can come from or go to the wrong process’ memory after the fork. SQL Anywhere performs O_DIRECT I/O operations according to the documented safe usage. However, THP appears to cause further problems and the O_DIRECT I/O data comprising database page reads/writes appears to get lost.
http://scn.sap.com/thread/3338917 and http://froebe.net/blog/2013/06/17/does-anyone-have-any-details-on-redhat-linux-bug-891857/
Does anyone know the status of this ongoing FIVE-year-old issue?
jason
Catalog cache usage behavior in IQ 15.4
Hello,
Question about catalog cache usage behavior.
We defined 3 GB for the IQ catalog cache using the -ch configuration option.
We are measuring IQ catalog cache usage with the following command:
select property('CurrentCacheSize')
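For context, a few related catalog-store cache properties can be queried the same way (a sketch; these are standard SQL Anywhere server properties and I am assuming they are exposed in this IQ release):
select property('MinCacheSize'), property('MaxCacheSize'), property('PeakCacheSize')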
We see an increase in cache usage of about 1% every day.
My question is:
What is the expected behavior? Will the cache usage reach a certain point and stop?
Our IQ version is:
Sybase IQ/15.4.0.3019/120816/P/ESD 2/Enterprise Linux64 - x86_64 - 2.6.18-194.el5/64bit/2012-08-16 10:48:47
Thanks in advance
Mark
Where is Sybase Central for IQ?
In the past it was located in the PC client bundle, IIRC. On SubscribeNet, I'm only seeing the network client for Win32, which doesn't include Sybase Central.
Please note that I'm not talking about Sybase Control Center and am not interested in that until it is rewritten.
How do Cartesian products affect Sybase IQ performance?
In my IQ message log, I have the following message:
I. 04/21 08:28:22. 0001847919 Exception Thrown from dfo_Root.cxx:822, Err# 0, tid 8 origtid 8
I. 04/21 08:28:22. 0001847919 O/S Err#: 0, ErrID: 9216 (df_Exception); SQLCode: -1005015, SQLState: 'QTA15', Severity: 14
I. 04/21 08:28:22. 0001847919 [20169]: The optimizer was unable to find a query plan that avoided cartesian product joins larger than the Max_Cartesian_Result setting
-- (dfo_Root.cxx 822)
I don't know whether this situation may affect overall performance.
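For reference, I believe the option named in the message can be checked and adjusted per session roughly like this (a sketch; the value is purely illustrative, and whether raising it is appropriate depends on the query):
select * from SYS.SYSOPTIONS where "option" = 'Max_Cartesian_Result';  -- check any stored setting for this option
SET TEMPORARY OPTION Max_Cartesian_Result = 100000000;                 -- illustrative value only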
Thank you
Identity_insert
Hi all,
I am trying to import data from a CSV file into a table through Interactive SQL and I see the error below. Please advise.
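In case it helps narrow this down: when the target table has an identity/autoincrement column and explicit values are supplied for it, the identity_insert option usually has to name that table first. A minimal sketch (owner, table, and column names hypothetical):
SET TEMPORARY OPTION identity_insert = 'DBA.my_table';   -- hypothetical owner.table
INSERT INTO DBA.my_table ( id_col, name_col ) VALUES ( 101, 'example' );
SET TEMPORARY OPTION identity_insert = '';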
Sybase IQ usage
Experts,
Greetings,
We have multiple SAP products like ERP, BW, PI, EP, CRM, etc., and Sybase ASE 15.7.0.042 is used as the database, with SCC 3.2.8 to manage it. Going forward we will be setting up DR as well.
The confusion is about where Sybase IQ fits in our environment. Is it specifically for BW?
If possible, can you please explain?
Regards
Mohammad
IQ 16 SP8, Question on Row Id
SAP IQ does not guarantee that row IDs will be sequential at the storage level for a table (gaps can exist).
But can new rows inserted into the same table use the row IDs in the gaps?
Example:
Say I insert 37,000 rows into a table and the row IDs are from 1 to 8,000 and from 600,001 to 629,000 (37,000 rows in total).
Later I insert (add) another 40,000 rows to the same table:
Does IQ guarantee that every new row inserted will be assigned a row ID bigger than the last row ID recorded in the table?
Or can the numbers in the gaps of the existing row IDs be assigned to new rows (i.e., if I add 40,000 new rows, can the new row IDs, or at least some of them, fall in the range 8,001 to 600,000)?
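For reference, the row IDs in question can be inspected roughly like this (table name hypothetical; this assumes the IQ rowid() function, which takes the table or correlation name as its argument):
SELECT MIN( ROWID( my_table ) ) AS min_rowid,
       MAX( ROWID( my_table ) ) AS max_rowid,
       COUNT(*)                 AS row_count
FROM   my_table;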
Thanks a lot,
Uvernes
Are materialized views available with IQ 16?
I had some issues creating a materialized view like this:
CREATE MATERIALIZED VIEW "DBA"."myView" IN "IQ_SYSTEM_MAIN" AS dateceiling(hh, timestampstart) as timestampstart_day, dateceiling(dd, timestampstart) as timestampstart_hour, datefloor(hh, timestampend) as timestampend_hour, datefloor(dd, timestampend) as timestampend_day from myTable;
Part of the error message is:
Materialized view definition must not use the following construct: 'Remote object'
Looking on SCN I found that IQ 15 did not support materialized views, but in the documentation for IQ 16 I found several references, e.g. at SyBooks Online.
Does IQ 16 support materialized views, and if so, what's wrong with my example SQL?
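For comparison, I would have expected the minimal form of the statement to look roughly like this (a sketch only; the view and column names are made up, myTable is the table from my attempt above, and IQ 16 restricts what a materialized view may reference):
CREATE MATERIALIZED VIEW "DBA"."mv_example" AS
    SELECT col1, col2
    FROM "DBA"."myTable";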
How to rename a table in Sybase IQ
How do I rename a table in Sybase IQ?
I have a select command like:
select * from "N"."tablename"
(What does "N" mean in the query? I need to change the table name to tablename_1.)
Backing up Sybase IQ DB
I am a backup admin trying to establish a strategy to backup Sybase IQ.
Currently we use Data Domain appliances for most of our backups. My understanding is that the Sybase IQ backup utility does some compression. I'm wondering if anyone is using DD either as a primary backup target for the DB backups (i.e., via a DD NFS mount on the DB server) or as a secondary backup target (i.e., back up to local disk and have the backup application sweep the filesystem and back it up to DD). If so, are you seeing any reasonable dedupe on the Data Domain appliance?
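For context, the primary-target case would amount to pointing the IQ backup statement at the DD NFS mount, something along these lines (path hypothetical):
BACKUP DATABASE FULL TO '/dd_nfs_mount/iq_backups/full_1';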
Thank you.
SAP IQ 16 remote procedure to MsSQL 2005 error.
Hi all,
I've in IQ a remote procedure that call via RPC an old MsSQL 2005 stored procedure.
In my old installation with IQ 12.7 ESD 4 everything worked well and I did not see any issues.
Now, with the new IQ 16.0 SP08, I see the following strange behaviours.
ERROR 1) When I call the remote procedure from IQ using SAP Interactive SQL, everything works well if I run the client from a remote host, i.e. from a host that is not the server where the IQ engine is running. If I execute the same call (again from SAP Interactive SQL) on the IQ engine server itself, the following error is reported:
There was an error reading the results of the SQL statement.
The displayed results may be incorrect or incomplete.
Cursor not in a valid state
SQLCODE=-853, ODBC 3 State="24000"
Cursor not in a valid state
SQLCODE=-853, ODBC 3 State="24000"
Server 'SVR_PRODOTTI': [Microsoft][ODBC SQL Server Driver]Function
sequence error
SQLCODE=-660, ODBC 3 State="HY000"
ERROR 2) I have the following scenario:
a. I call a remote procedure (again the MsSQL 2005 remote procedure described above) inside a standard IQ stored procedure
b. the remote procedure fills a remote table with some data
c. after the remote procedure has executed, I want to pull the produced data into IQ via a proxy table
Looking at the MsSQL side, I see that steps (a) and (b) are executed, but when step (c) starts running it remains blocked by the session that ran steps (a) and (b), even though they have already finished. See the following extract of the session state from the MsSQL side; a rough SQL sketch of steps (a) to (c) follows after these details.
SPID  Status     Login  HostName   BlkBy  DBName  Command           CPUTime  DiskIO  LastBatch       ProgramName  SPID  REQUESTID
177   SUSPENDED  usr    IQ_SERVER  287    master  EXECUTE           0        0       06/19 17:31:38  Sybase IQ    177   0
287   sleeping   usr    IQ_SERVER  .      master  AWAITING COMMAND  9829     153149  06/19 17:31:38  Sybase IQ    287   0
- SPID = 287 (remote procedure call)
RPC Event 0 sp_remote_procedure;1
- SPID = 177 (proxy table access)
Language Event 0 SELECT .... FROM .... t1
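For reference, steps (a) to (c) above correspond roughly to the following sequence (the procedure and proxy table names are hypothetical; only the server name SVR_PRODOTTI is taken from the error above):
-- (a) remote procedure mapped to the MsSQL procedure via RPC (names hypothetical)
CREATE PROCEDURE sp_remote_procedure()
    AT 'SVR_PRODOTTI.master.dbo.sp_remote_procedure';
-- (a)/(b) call it from an IQ stored procedure; it fills a remote table
CALL sp_remote_procedure();
-- (c) read the produced data back into IQ through a proxy table
SELECT * FROM remote_result_proxy_table;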
Environment details:
IQ -> SAP IQ 16.0 SP08 on Win 2012 R2 server (64 bit)
MSSQL -> MsSQL 2005 SP 2 32 bit Enterprise on Win 2003 server (32 bit)
Do you have any suggestions about these strange behaviours?
Best regards,
Stefano
IQ 16 not supported on Windows Server 2012?
Hi,
I tried to install IQ 16 on a Windows Server 2012 box, but I got a 'LAX' error during the install.
Is Windows Server 2012 not supported, or is it a problem with my environment?
If it is not supported, are there any plans to support it?
Regards,
Jim.
SAP IQ CPU(core) to disk I/O ratio
Good Day,
Question regarding the CPU to disk I/O relation:
1) Initially we had two SAP IQ 12.7 instances running on a Linux x64 Red Hat 5.5 machine with 16 cores / 256 GB RAM.
2) Now we have moved to SAP IQ 15.4 on the same machine, leaving just this one instance with the same number of cores and the same RAM available.
This instance is heavily used in terms of data load, query, and reporting during the day.
Regarding the CPU to disk I/O ratio, is it valid to assume the following:
If we bumped up the number of cores (16 are now available to IQ), would the number of disk I/O requests also increase, assuming the same workload and number of connected users?
There is an old EMC Symmetrix storage array involved here, which wasn't upgraded.
Thank you
Regards
Wanted: ability to export/import a schema
Currently there is no easy way to extract and restore the objects, permissions, data, etc. for a particular schema.
Say you need to extract a schema to be restored onto multiple IQ instances (prod -> regression, testing, and development). Doing a backup/restore of the database is not practical, as the schema may be only a few hundred megabytes while the database is terabytes in size; plus there is the whole mess of adding and removing users due to the different environments.
In Oracle, we can do this very easily with:
export schemas
expdp system/tiger@db11g schemas=SCOTT,USER1,USER2 directory=TEST_DIR dumpfile=schemas.dmp logfile=expdp.log
import only USER1 schema
impdp system/tiger@db11g schemas=USER1 directory=TEST_DIR dumpfile=schemas.dmp logfile=impdp.log
Surely the smart people at SAP Sybase can create such a feature without too much trouble.
Including a link to Data Pump; Oracle has had this for more than a decade:
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php
IQ message log questions
I'm curious about:
- Why does SAP IQ record so many transactions in the iqmsg log even when there is no activity?
- Why does the connection ID increment even when no user is connecting?
- How do you associate a given transaction with its associated commit transaction?
Thanks for your help.