Server Administration :: How To Get Tables In Oracle 10g
Dec 17, 2010
I have installed the Oracle 10g software and created my own database on RHEL 4. I got the EMP tables when I ran @?/rdbms/admin/utlsampl.sql as the main user SYSTEM. After that I created another user, TEST, and tried to query the tables, but I am getting an error that the table does not exist.
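A minimal sketch of what would be needed, assuming the sample tables were created in the SCOTT schema (which is where utlsampl.sql puts them) and the new user is TEST:

-- connected as SYSTEM (or SCOTT): let TEST read the table
GRANT SELECT ON scott.emp TO test;

-- connected as TEST: qualify the owner, or create a synonym
SELECT * FROM scott.emp;
CREATE SYNONYM emp FOR scott.emp;   -- requires the CREATE SYNONYM privilege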
I have installed the Oracle 10g software and created a database on RHEL 4. When I run run.sql it completes successfully, but I am not getting the tables. What is the problem?
I have a problem moving an old DB to a new one (the same DB, 10.2.0 on Windows 2003; the first is 32-bit, the second 64-bit). I want to move the DB from 32-bit to 64-bit. The problem is that all objects in the old DB were created in the SYSTEM schema by SYS. I can't export those objects (with data) because neither impdp nor imp touches them (tables with indexes), so I can't use the export/import procedure. I'm looking for another method to transfer the data; which would be the best and fastest? Maybe a file copy at OS level? I suppose there would be problems with configuration files; the database has other tablespaces.
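One possible alternative, sketched below under the assumption that a database link can be created from the new 64-bit database back to the old 32-bit one (the link name OLD_DB, the TNS alias, and the target schema APP_OWNER are all hypothetical):

-- on the new 64-bit database
CREATE DATABASE LINK old_db
  CONNECT TO system IDENTIFIED BY old_password   -- placeholder password
  USING 'OLD_DB_TNS';                            -- TNS alias for the 32-bit database

-- copy each table's data across the link; indexes and constraints
-- are then re-created on the target side
CREATE TABLE app_owner.some_table AS
  SELECT * FROM system.some_table@old_db;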
I am trying to describe all the tables in a database. We use desc or describe tablename; to describe a single table, but what is the command to describe all the tables in a database (I don't need the system tables)?
Once I log into SQL*Plus as, say, ABC (SID or host string) as a user, if I do a DESC tablename I get the column name, data type, NULL/NOT NULL and so on, but I need that for all the tables in that ABC database.
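One way to approximate a DESC of every table is to query the data dictionary rather than use a SQL*Plus command; a minimal sketch, run while connected as the ABC user:

-- lists column name, data type and nullability for every table the user owns
SELECT table_name,
       column_name,
       data_type,
       data_length,
       nullable
FROM   user_tab_columns
ORDER  BY table_name, column_id;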
In an attempt to take older data offline and allow database refreshes to be faster, the tablespaces associated with partitioned table data for a given time period were taken offline, leaving only the tablespaces that relate to the current time period online. In effect, tablespaces related to 2010 and earlier were taken offline from a table.
1. Without a filter on the partition key (the business date) restricting scans to dates later than those in the offlined tablespace partitions, we get an ORA-376/ORA-1110 error (data file cannot be read at this time).
2. Materialized views using fast refresh or refresh on commit also do not work because of the partitions being offline.
Queries directly against the tables are manageable from an application point of view, but the materialized views failing to aggregate is a bigger problem.
How can we manage this situation? I know that I can move the partitions to a different table in a tablespace to be taken offline, but if possible we wanted to solve this without doing a move partition.
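For the direct queries, the workaround mentioned above amounts to making sure the partition-key predicate prunes the offlined partitions; a minimal sketch, with hypothetical table and column names:

-- SALES is range-partitioned on BUSINESS_DATE; partitions for 2010 and earlier
-- sit in offline tablespaces. The predicate lets the optimizer prune them,
-- so no offlined data file is touched and no ORA-376/ORA-1110 is raised.
SELECT SUM(amount)
FROM   sales
WHERE  business_date >= DATE '2011-01-01';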
Create a procedure or cursor to allocate extents to all tables with zero rows, for all users in the database. I have used the query below to check for tables with zero rows and no extents allocated.
select owner,table_name,initial_extent from dba_tables where initial_extent is null order by owner;
I generated the statements to allocate extents by using concatenation on the above query:
select 'ALTER TABLE '||table_name||' ALLOCATE EXTENT;' from dba_tables where initial_extent is null order by owner;
Now I want the extent allocation to happen automatically for all such tables with zero rows.
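A minimal sketch of that automation as an anonymous PL/SQL block (it simply wraps the generated ALTER TABLE statements in a loop):

BEGIN
  FOR t IN (SELECT owner, table_name
            FROM   dba_tables
            WHERE  initial_extent IS NULL
            ORDER  BY owner)
  LOOP
    -- allocate an extent for each table that has none yet
    EXECUTE IMMEDIATE 'ALTER TABLE "' || t.owner || '"."' || t.table_name ||
                      '" ALLOCATE EXTENT';
  END LOOP;
END;
/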
There is a read-only user on our reporting server. Developers want to use global temporary tables with this user, and I don't want the user to have permissions beyond read-only. I can grant the user the CREATE TABLE privilege without granting quota on any permanent tablespace, so the user would not be able to create permanent tables but should still be able to create global temporary tables.
The question is: would a user with such permissions still be able to use temporary tables as part of a scheduled job?
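For reference, a minimal sketch of the setup being described (REPORT_RO is a hypothetical user name); a global temporary table's rows live in the temporary tablespace, so no quota on a permanent tablespace is needed:

GRANT CREATE TABLE TO report_ro;

-- connected as REPORT_RO
CREATE GLOBAL TEMPORARY TABLE gtt_work (
  id  NUMBER,
  val VARCHAR2(100)
) ON COMMIT PRESERVE ROWS;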
We want to find the difference in data for some tables between the current day and the previous day. We can use a query with a MINUS operation, but it takes a lot of time since the tables range from 200 to 500 GB in size, and we have to do this exercise every day.
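One common way to cut the cost of that comparison is to compare compact per-row hashes instead of full rows; a minimal sketch, assuming a primary key ID and a previous-day copy of the table (all names hypothetical):

-- rows that are new or changed since the previous day
SELECT id, ORA_HASH(col1 || '|' || col2 || '|' || col3) AS row_hash
FROM   big_tab
MINUS
SELECT id, ORA_HASH(col1 || '|' || col2 || '|' || col3) AS row_hash
FROM   big_tab_prev;

Both tables are still scanned, but only the key and a hash are sorted and compared, which keeps the MINUS much cheaper than comparing every column.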
Is it possible to create triggers on the various tables and views that exist in the data dictionary (i.e. the dynamic performance views), fired whenever any DML operation is performed by Oracle itself?
Users use a front end (called ESS Console), and when they try to open one of those tables they wait a very long time (really bad performance). Sometimes the GUI even hangs without displaying results.
Would the partitioned tables feature give better performance here?
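Partitioning can help, but only when the queries actually prune partitions; a minimal sketch of a range-partitioned table (all names hypothetical):

CREATE TABLE ess_facts (
  business_date DATE,
  amount        NUMBER
)
PARTITION BY RANGE (business_date) (
  PARTITION p2012 VALUES LESS THAN (DATE '2013-01-01'),
  PARTITION p2013 VALUES LESS THAN (DATE '2014-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

Queries that filter on business_date then read only the relevant partitions instead of the whole table.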
We are using Oracle 10g and have 10 tablespaces defined for our database, which contains 108 tables. The size of the 108 tables is around 251 MB, as seen while importing the dump. When creating these 10 tablespaces I used the parameters below for space allocation:
SIZE 1M REUSE AUTOEXTEND ON NEXT 1M MAXSIZE 1M;
which set the initial space for the 10 tablespaces to around 1032 KB each. Now my question is: after importing the dump, how does the disk space for the 10 tablespaces increase to 398 MB in total?
Is there any relation between the tablespace disk space and the actual data present in the tables?
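A minimal sketch of two dictionary queries that show the relationship: the first gives the space allocated to the datafiles, the second the space actually used by segments:

-- space allocated to the datafiles of each tablespace
SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS allocated_mb
FROM   dba_data_files
GROUP  BY tablespace_name;

-- space actually consumed by segments in each tablespace
SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS used_mb
FROM   dba_segments
GROUP  BY tablespace_name;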
I have Oracle 10g on an XP machine and use the 'Oracle in OraDB10g_home1' driver to read the data. I have another Windows Server 2008 R2 machine on the same network, with SQL Server 2012 on it. What is the best way to read Oracle tables in SQL Server? Can I set up an ODBC link from my Windows Server machine to the Oracle database (which would require me to download an Oracle ODBC driver)? Or is the best way to export the required tables from Oracle (e.g. into CSV format) and import them into SQL Server?
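One option on the SQL Server side is a linked server; a minimal sketch, assuming the Oracle OLE DB provider (OraOLEDB.Oracle) is installed on the SQL Server machine and a TNS alias ORCL10G points at the Oracle database (both are assumptions):

EXEC sp_addlinkedserver
     @server     = N'ORA10G_LINK',
     @srvproduct = N'Oracle',
     @provider   = N'OraOLEDB.Oracle',
     @datasrc    = N'ORCL10G';

EXEC sp_addlinkedsrvlogin
     @rmtsrvname  = N'ORA10G_LINK',
     @useself     = 'FALSE',
     @rmtuser     = N'scott',
     @rmtpassword = N'tiger';

-- query an Oracle table through the link with a four-part name
SELECT * FROM ORA10G_LINK..SCOTT.EMP;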
I am trying to install Oracle 10.10.2.0 on Windows Server 2003 Standard x64 Edition Service Pack, but when I try to run the installer or open the DVD it gives me the error below.
"The image file D: is Valid, but is for a machine type other than the current machine."
I was trying to delete the database on the test server. When I was deleting it, the listener was already stopped. I continued deleting using DBCA, and it showed me an alert that the datafiles can't be deleted because the system couldn't find the database; since the listener was stopped, only the service was deleted (the one showing in Windows Administrative Tools > Services as OracleServiceTEST).
All the datafiles and parameter files are still there. How can I delete the datafiles and parameter files belonging to that database, or how can I re-create the deleted service so that I can start the listener and finish deleting the database completely?
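If the OracleServiceTEST service can be re-created (for example with the oradim utility) so that the instance can be mounted again, a minimal sketch of the clean-up in SQL*Plus would be (10g syntax; this removes the datafiles, online redo logs, and control files, but not the spfile):

STARTUP RESTRICT MOUNT
DROP DATABASE;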
I am trying to find the Unix process for one of my applications in the database, but I am unable to see it. To simulate this, I did the following:
1. My database runs on a different server.
2. I invoked sqlplus from another Unix box to log in to the database.
3. I found the process id (ps -ef | grep sqlplus).
4. When I execute the query below, it does not display the process id that I am looking for, although the osuser, username, program and machine details are correct. How can I find the process details from the database?
SELECT SYS.GV_$SESSION.OSUSER,
       SYS.GV_$SESSION.USERNAME,
       SYS.GV_$PROCESS.SPID,
       SYS.GV_$SESSION.MACHINE,
       SYS.GV_$SESSION.PROGRAM,
       SYS.GV_$PROCESS.PROGRAM,
       SYS.GV_$SESSION.SQL_ID
FROM   SYS.GV_$PROCESS, SYS.GV_$SESSION
WHERE  SYS.GV_$PROCESS.ADDR = SYS.GV_$SESSION.PADDR
AND    SYS.GV_$SESSION.USERNAME = 'TEST'
AND    SYS.GV_$SESSION.MACHINE LIKE '%hostname%'
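Note that GV$PROCESS.SPID is the PID of the server process on the database host; the client-side PID (the sqlplus process found with ps -ef on the other box) is exposed in GV$SESSION.PROCESS. A minimal sketch selecting both:

SELECT s.osuser,
       s.username,
       s.machine,
       s.program,
       s.process AS client_pid,   -- OS process id of the client (e.g. sqlplus)
       p.spid    AS server_pid    -- OS process id of the Oracle server process
FROM   gv$session s
       JOIN gv$process p
         ON p.addr    = s.paddr
        AND p.inst_id = s.inst_id
WHERE  s.username = 'TEST'
AND    s.machine LIKE '%hostname%';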
I want to install Oracle 11g R2 on a Windows 2008 64-bit server. How can I tell whether my server is ready for the Oracle install, i.e. whether all required components are available on the server, whether any patch needs to be applied, and so on?
I'm trying to connect an Oracle client application on the client machine to a remote Oracle server on the server machine, but the connection fails.
On the server machine I configured the Oracle server in the following way:
Installed the Oracle server. Created a database "DB_Test" with the Database Configuration Assistant. Created a LISTENER with Oracle Net Manager with the following parameters:
Protocol: TCP/IP; Host: server PC hostname (ENZOVAIO) or server machine IP address (192.168.0.71) on the LAN; Port number: 1521.
Created the "dbtest" service with Oracle Net Manager with the following parameters: Service name: "dbtest"; Protocol: TCP/IP; Host: server PC hostname (ENZOVAIO) or server machine IP address (192.168.0.71) on the LAN; Port number: 1521.
All services on the server machine are running, and I opened port 1521 on the router. On the client machine I installed SQL*Plus and SQL Developer.
With SQL*Plus, as per the official documentation, I entered the following command:
CONNECT username/password@[//]host[:port][/service_name]. In my case that is: CONNECT SYSTEM/oracledb@//ENZOVAIO:1521/testdb.
With SQL Developer I entered the same parameters.
But with both SQL*Plus and SQL Developer the connection fails.
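One detail worth double-checking: the service created above is named "dbtest" while the connect string uses "testdb", and in this easy-connect form the service name has to match the one registered with the listener. A minimal sketch with the names from the post:

CONNECT SYSTEM/oracledb@//ENZOVAIO:1521/dbtest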
We performed an image copy of the production Oracle server (OS and instances) to a backup server. After a few weeks, we tried to restore the latest Oracle database backup from the production server onto the backup server. As we know, an Oracle instance must be unique on the network.
Even if we log on to the backup server and bring up the instance, I think it still points to the production instance, since the init file, TNSNAMES.ora, and listener file are all still the same. If we restore the database, we will end up bringing down the production instance and restoring on top of production. How do we change the instance name on the backup server, including the TNSNAMES, sqlnet, and listener files, so that we can restore the Oracle database from production to the backup server?
I recently installed Oracle 10g on my Windows XP laptop, and it has become considerably slower since then. I want to start the database server only when I need it, not every time I start my laptop. I looked around in OEM but have not found a way.
I am connected as SYSTEM. It was the only user I set a password for when I installed the database on my personal computer.
SQL> alter user sys identified by mypass007
  2  /
User altered.
SQL> connect sys/mypass007
ERROR:
ORA-28009: connection as SYS should be as SYSDBA or SYSOPER
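ORA-28009 simply means SYS cannot make a normal connection; a minimal sketch of the privileged connect:

CONNECT sys/mypass007 AS SYSDBA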
We will be having a meeting with our client regarding their database server migration (they are planning to buy a new server). Their current database is Oracle 10gR2; they will not upgrade to 11g, they just plan to migrate to a new, more powerful machine.
I was planning to ask the following questions.
1. Specifications of the current server and the new one.
2. Operating system (I think they will use the same OS, just an updated one).
3. Can the business afford full downtime on the current servers?
4. Size of the DB, because it can take hours to move large files.
And is there documentation regarding server migration (a change of machine only, not a database upgrade or anything else)?