Oracle Standard Edition on AWS ☁ socket arithmetic

By Franck Pachot

Note that I’ve written about Oracle Standard Edition 2 licensing before, but a few rules have changed. This was written in May 2020.
TL;DR: 4 vCPUs count as 1 socket, and 2 sockets count as 1 server, whether hyper-threading is enabled or not.

The SE2 rules

I think the Standard Edition rules are quite clear now: maximum server capacity, cluster limit, minimum NUP, and processor metric. Oracle has them in the Database Licensing guideline.

2 socket capacity per server

Oracle Database Standard Edition 2 may only be licensed on servers that have a maximum capacity of 2 sockets.
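
As a rough sketch of that arithmetic (my own illustration, not an official calculator), here is how the vCPU-to-socket conversion works out for a few AWS instance sizes:

```python
# A rough sketch of the SE2 socket arithmetic on AWS:
# 4 vCPUs count as 1 socket, whether hyper-threading is enabled or not,
# and SE2 may only be licensed on servers with at most 2 sockets.
import math

def se2_sockets(vcpus: int) -> int:
    """Socket equivalence of an AWS instance for SE2 licensing."""
    return math.ceil(vcpus / 4)

for vcpus in (2, 4, 8, 16):
    sockets = se2_sockets(vcpus)
    print(f"{vcpus:>2} vCPUs -> {sockets} socket(s), "
          f"SE2 {'allowed' if sockets <= 2 else 'not allowed'}")
```

So an 8 vCPU instance counts as 2 sockets and is the largest shape where SE2 can run; 16 vCPUs would exceed the 2-socket server limit.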

Always free / always up tmux in the Oracle Cloud with KSplice updates

By Franck Pachot

I used to have many VirtualBox VMs on my laptop. But now, most of my labs are in the Cloud. Easy access from everywhere.

GCP

There’s the Google Cloud free VM, which is not limited in time (the 11g XE instance I created 2 years ago is still running there) and can use 40% of a CPU with 2GB of RAM:

AWS Certified Database Specialty (DBS-C01)

Here is my feedback after preparing for and passing the AWS Database Specialty certification. There are tips about the exam, but also some thoughts that came to mind during preparation, when I had to shift from a multi-purpose database system to purpose-built database services.

Oracle disables your multitenant option when you run on EC2

I have installed Oracle 19.6 on an EC2 instance for our Multitenant Workshop training. And of course, during the workshop we create a lot of PDBs. If you haven’t paid for the Enterprise Edition plus the Multitenant option, you can create at most 3 pluggable databases. With this option you can create up to 252. Is it worth the price? According to the public price list that is USD 47,500 + 17,500 per processor, which means per core, because Oracle doesn’t apply the core factor when your Intel processors are in the AWS Cloud (according to the Authorized Cloud Environments paper).
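
To put rough numbers on that, here is a back-of-the-envelope sketch of my own, using the list prices quoted above and the counting rule from the Authorized Cloud Environments paper (with hyper-threading enabled, 2 vCPUs count as 1 processor license, and the core factor does not apply):

```python
# Back-of-the-envelope sketch: EE + Multitenant list price on EC2.
# Prices are the public list prices quoted above; the 2-vCPUs-per-license
# counting comes from the Authorized Cloud Environments paper.
EE_PER_PROCESSOR = 47_500   # USD, Enterprise Edition
MT_PER_PROCESSOR = 17_500   # USD, Multitenant option

def ee_multitenant_list_price(vcpus: int) -> int:
    """List price to run EE + Multitenant on an EC2 instance."""
    processor_licenses = vcpus // 2   # 2 vCPUs = 1 processor license
    return processor_licenses * (EE_PER_PROCESSOR + MT_PER_PROCESSOR)

# Example: 8 vCPUs -> 4 processor licenses -> 260,000 USD at list price
print(ee_multitenant_list_price(8))
```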

DynamoDB: adding a Local covering index to reduce the cost

By Franck Pachot

This is a continuation of the previous post, DynamoDB: adding a Global Covering Index to reduce the cost. I have a DynamoDB table partitioned on (“MyKeyPart”,“MyKeySort”) and many queries that retrieve a small “MyInfo001” attribute, plus less frequent ones that need the large “MyData001” attribute. I have created a Global Secondary Index (GSI) that covers the same key and this small attribute. Now, because the index is prefixed by the partition key, I can create a Local Secondary Index (LSI) to do the same. But there are many limitations. The first one is that I cannot add a local index afterwards: I need to define it at table creation.

Drop table

Here I am in a lab, so I can drop and re-create the table. In real life you may have to create a new one, copy the items, synchronize it (DynamoDB Streams),…
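
Here is a minimal boto3 sketch of that re-creation, assuming the table and attribute names used in this series (table name, attribute types and index name are my own assumptions; the projection keeps only the small “MyInfo001” attribute, which is the covering part):

```python
# Sketch only: an LSI cannot be added later, so the table is re-created
# with the index defined up front.
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.delete_table(TableName="Demo")   # lab only!
dynamodb.get_waiter("table_not_exists").wait(TableName="Demo")

dynamodb.create_table(
    TableName="Demo",
    AttributeDefinitions=[
        {"AttributeName": "MyKeyPart", "AttributeType": "S"},
        {"AttributeName": "MyKeySort", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "MyKeyPart", "KeyType": "HASH"},
        {"AttributeName": "MyKeySort", "KeyType": "RANGE"},
    ],
    LocalSecondaryIndexes=[{
        "IndexName": "DemoLSI",
        "KeySchema": [
            {"AttributeName": "MyKeyPart", "KeyType": "HASH"},
            {"AttributeName": "MyKeySort", "KeyType": "RANGE"},
        ],
        # the covering part: project only the small attribute
        "Projection": {"ProjectionType": "INCLUDE",
                       "NonKeyAttributes": ["MyInfo001"]},
    }],
    BillingMode="PAY_PER_REQUEST",
)
```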

DynamoDB: adding a Global covering index to reduce the cost

By Franck Pachot

People often think of indexes as a way to optimize row filtering (“get item” faster and cheaper). But indexes are also about columns (“attribute projection”), like some kind of vertical partitioning. In relational (“SQL”) databases we often add more columns to the indexed key: these are called “covering” or “including” indexes, used to avoid reading the whole row. The same is true in NoSQL. I’ll show in this post how, even when an index is not required to filter the items because the primary key partitioning is sufficient, we may have to create a secondary index to reduce the cost of partial access to the item. Here is an example with AWS DynamoDB, where the cost depends on I/O throughput.
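
A minimal boto3 sketch of the idea, with table, attribute and index names that are my own assumptions (note that a global index, unlike a local one, can be added to an existing table):

```python
# Sketch only: add a Global Secondary Index that "covers" the small
# attribute, so frequent queries never have to read the large item.
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.update_table(
    TableName="Demo",
    AttributeDefinitions=[
        {"AttributeName": "MyKeyPart", "AttributeType": "S"},
        {"AttributeName": "MyKeySort", "AttributeType": "S"},
    ],
    GlobalSecondaryIndexUpdates=[{
        "Create": {
            "IndexName": "DemoGSI",
            "KeySchema": [
                {"AttributeName": "MyKeyPart", "KeyType": "HASH"},
                {"AttributeName": "MyKeySort", "KeyType": "RANGE"},
            ],
            # project only the small attribute: queries on the index
            # consume I/O proportional to this projection, not the item
            "Projection": {"ProjectionType": "INCLUDE",
                           "NonKeyAttributes": ["MyInfo001"]},
        }
    }],
)
```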

Developing Applications with CockroachDB… end-to-end example with SpringBoot, Java, Hibernate, Gradle

overview

Application development frameworks like SpringBoot have done a good job of giving developers a quick start to developing applications with Java. Layering object-relational mapping on top of SQL to remove proprietary syntax has further simplified the development of database applications.

CockroachDB allows developers to take advantage of the evolution towards globally distributed data with a simple SQL interface. But each database can bring its own specific syntax and cause developers to hesitate when getting started with a new database.

How do you get started?

Oracle 20c SQL Macros: a scalar example to join agility and performance

By Franck Pachot

Let’s say you have a PEOPLE table with FIRST_NAME and LAST_NAME, and you want, in many places of your application, to display the full name. Usually my name will be displayed as ‘Franck Pachot’, and I can simply add a virtual column to my table, or a view, as: initcap(FIRST_NAME)||' '||initcap(LAST_NAME). Those are simple SQL functions. No need for procedural code there, right? But, one day, the business will come with new requirements. In some countries (I’ve heard about Hungary, but there are others), my name may be displayed with the last name first, like: ‘Pachot Franck’. And in some contexts, it may have a comma, like: ‘Pachot, Franck’.
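
To make the requirement concrete, here is the same display logic sketched in Python (my own illustration; the SQL macro discussed in this post plays this role inside the database, so one rule change touches one place only):

```python
# Sketch of the display-format logic that keeps changing with the business:
# one central function instead of the expression repeated in every query.
def full_name(first: str, last: str, style: str = "western") -> str:
    first, last = first.capitalize(), last.capitalize()  # like initcap()
    if style == "western":   # 'Franck Pachot'
        return f"{first} {last}"
    if style == "eastern":   # 'Pachot Franck' (e.g. Hungary)
        return f"{last} {first}"
    if style == "listing":   # 'Pachot, Franck'
        return f"{last}, {first}"
    raise ValueError(f"unknown style: {style}")

print(full_name("franck", "pachot"))             # Franck Pachot
print(full_name("franck", "pachot", "eastern"))  # Pachot Franck
print(full_name("franck", "pachot", "listing"))  # Pachot, Franck
```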

There comes a religious debate between Dev and Ops:

Demystifying JSON with CockroachDB… Import, Index, and Computed Columns

Overview

Recently, I created and delivered an "Advanced Developer Workshop" for CockroachLabs. One of the topics dove into how to ingest and use JSON data.

Like many databases, CockroachDB has the ability to use JSON data type for columns within a table. Basically, you insert a JSONB object into a row, and then can filter and extract the desired data with SQL. The following simple example shows how this is done:
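
Here is a minimal sketch of that pattern, using Python with psycopg2 over CockroachDB’s PostgreSQL wire protocol; the connection string, table and field names are assumptions for this illustration:

```python
# Sketch only: store a JSONB document in a row, then filter and extract
# fields with SQL (->> returns a JSON field as a string).
import psycopg2

conn = psycopg2.connect("postgresql://root@localhost:26257/defaultdb")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS users (id INT PRIMARY KEY, attrs JSONB)"
    )
    cur.execute(
        "INSERT INTO users VALUES (1, '{\"name\": \"carl\", \"city\": \"nyc\"}')"
    )
    # filter on one JSON field, extract another
    cur.execute("SELECT attrs->>'name' FROM users WHERE attrs->>'city' = 'nyc'")
    print(cur.fetchall())   # [('carl',)]
```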

So it is pretty straightforward to use JSONB objects within tables, but how do you load those HUGE JSON files into CockroachDB?

Running SQL Server on the Oracle Free tier

By Franck Pachot

The Oracle Cloud is not only for Oracle Database. You can create a VM running Oracle Linux with full root access, even in the free tier: a free VM that is always up, never expires, with full ssh connectivity to a sudoer user, and where you can tunnel any port. Of course, there are some limits, which I’ve detailed in a previous post. But that is sufficient to run a database, provided you configure low memory usage. For Oracle Database XE, Kamil Stawiarski mentions that you can just hack the memory test in the RPM shell script.