Commit 3d8acd39 by Konstantin Käfer

Merge branch 'master' of github.com:developmentseed/node-sqlite3

Conflicts:
	package.json
parents f95d39b8 ffdc924b
@@ -5,6 +5,8 @@
*.lo
*.Makefile
*.target.gyp.mk
stage
lib/binding
build
out
Release
@@ -16,5 +18,6 @@ gyp-mac-tool
.dirstamp
npm-debug.log
test/support/big.db
lib/node_sqlite3.node
test/tmp
.DS_Store
\ No newline at end of file
language: node_js
node_js:
- "0.10"
- "0.8"
- "0.6"
install:
- npm install
- npm install mocha
- npm test
before_script: "make clean"
script:
- npm install --stage
- npm test
# Changelog
## 2.1.15
Released August 7th, 2013
- Minor readme additions and code optimizations
build:
	node-gyp build
	npm install --build-from-source

clean:
	node-gyp clean
	rm -f ./lib/node_sqlite3.node
	rm -rf ./lib/binding/
	#rm -f ./test/support/big.db*
	rm -f ./test/tmp/*
	rm -rf ./deps/sqlite-autoconf-*/
	rm -rf ./build
	rm -rf ./out

db:
	@if ! [ -f test/support/big.db ]; then \
		echo "Creating test database... This may take several minutes." ; \
		node test/support/createdb.js ; \
	fi

test: build db
	npm test

.PHONY: build clean test
# NAME
node-sqlite3 - Asynchronous, non-blocking [SQLite3](http://sqlite.org/) bindings for [Node.js](http://nodejs.org/) 0.2-0.4 (versions 2.0.x), **0.6.13+, 0.8.x, and 0.10.x** (versions 2.1.x).
(Can also run in [node-webkit](https://github.com/rogerwang/node-webkit) if it uses a supported version of Node's engine.)
[![Build Status](https://travis-ci.org/developmentseed/node-sqlite3.png?branch=master)](https://travis-ci.org/developmentseed/node-sqlite3)
[![npm package version](https://badge.fury.io/js/sqlite3.png)](https://npmjs.org/package/sqlite3)
# USAGE
Install with `npm install sqlite3`. **Note:** the module must be [installed](#installing) before use.
``` js
var sqlite3 = require('sqlite3').verbose();
```

@@ -48,33 +52,81 @@ db.close();
See the [API documentation](https://github.com/developmentseed/node-sqlite3/wiki) in the wiki.
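For quick reference, here is a minimal sketch of typical use, along the lines of the examples in the wiki (the `lorem` table and its contents are purely illustrative):

``` js
var sqlite3 = require('sqlite3').verbose();
var db = new sqlite3.Database(':memory:');

db.serialize(function() {
  // Create a throwaway table and fill it with a few rows.
  db.run("CREATE TABLE lorem (info TEXT)");

  var stmt = db.prepare("INSERT INTO lorem VALUES (?)");
  for (var i = 0; i < 10; i++) {
    stmt.run("Ipsum " + i);
  }
  stmt.finalize();

  // Read the rows back; the callback fires once per row.
  db.each("SELECT rowid AS id, info FROM lorem", function(err, row) {
    console.log(row.id + ": " + row.info);
  });
});

db.close();
```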
# INSTALLING
You can use [`npm`](https://github.com/isaacs/npm) to download and install:
* The latest `sqlite3` package: `npm install sqlite3`
* GitHub's `master` branch: `npm install https://github.com/developmentseed/node-sqlite3/tarball/master`
In both cases the module is automatically built with npm's internal version of `node-gyp`,
and thus your system must meet [node-gyp's requirements](https://github.com/TooTallNate/node-gyp#installation).
It is also possible to make your own build of `sqlite3` from its source instead of its npm package ([see below](#building-from-the-source)).
It is possible to use the installed package in [node-webkit](https://github.com/rogerwang/node-webkit) instead of vanilla Node.js, but a rebuild is required before use (see the next section).
# REBUILDING FOR NODE-WEBKIT
Because of ABI differences, only a rebuilt version of `sqlite3` can be used in [node-webkit](https://github.com/rogerwang/node-webkit).
After the `sqlite3` module is installed (according to the previous section), do the following:
1. Install [`nw-gyp`](https://github.com/rogerwang/nw-gyp) globally: `npm install nw-gyp -g` *(unless already installed)*
2. Use `nw-gyp` to rebuild the module: `nw-gyp rebuild --target=0.6.2`
Remember the following:

* In the `nw-gyp rebuild` command, specify the actual target version of your node-webkit. The command must be run in sqlite3's directory (where its `package.json` resides).
* After the `sqlite3` package is rebuilt for node-webkit it cannot run in vanilla Node.js (and vice versa). For example, `npm test` of the node-webkit package would fail.
* If you need the `sqlite3` package both for Node.js and node-webkit, make two separate installations of `sqlite3` (in different directories) and rebuild only one of them for node-webkit.

Visit the “[Using Node modules](https://github.com/rogerwang/node-webkit/wiki/Using-Node-modules)” article in the node-webkit wiki for more details.

# BUILDING FROM THE SOURCE

Unless building via `npm install` (which uses its own `node-gyp`) you will need `node-gyp` installed globally:

    npm install node-gyp -g

To obtain and build the bindings:

    git clone git://github.com/developmentseed/node-sqlite3.git
    cd node-sqlite3
    ./configure
    make

You can also use [`npm`](https://github.com/isaacs/npm) to download and install them:

    npm install sqlite3

The sqlite3 module depends only on libsqlite3. However, by default, an internal/bundled copy of sqlite will be built and statically linked, so an externally installed sqlite3 is not required.

If you wish to install against an external sqlite then you need to pass the `--sqlite` argument to `node-gyp`, `npm install` or the `configure` wrapper:

    ./configure --sqlite=/usr/local
    make

Or, using node-gyp directly:

    node-gyp --sqlite=/usr/local
    make

Or, using npm:

    npm install --sqlite=/usr/local

If building against an external sqlite3 make sure to have the development headers available. Mac OS X ships with these by default. If you don't have them installed, install the `-dev` package with your package manager, e.g. `apt-get install libsqlite3-dev` for Debian/Ubuntu. Make sure that you have at least `libsqlite3` >= 3.6.

Note, if building against homebrew-installed sqlite on OS X you can do:

    ./configure --sqlite=/usr/local/opt/sqlite/
    make
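Whichever way the module was built, a quick sanity check is to load it and ask the linked SQLite for its version. This is only a sketch; the file name `check.js` is arbitrary:

``` js
// check.js -- verify that the bindings load and report which SQLite they use.
var sqlite3 = require('sqlite3').verbose();
var db = new sqlite3.Database(':memory:');

// sqlite_version() is a built-in SQL function provided by SQLite itself.
db.get("SELECT sqlite_version() AS version", function(err, row) {
  if (err) throw err;
  console.log('node-sqlite3 is using SQLite ' + row.version);
  db.close();
});
```

Run it with `node check.js`.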
# TESTING

[mocha](https://github.com/visionmedia/mocha) is required to run the unit tests ([expresso](https://github.com/visionmedia/expresso) was used by earlier releases).

In sqlite3's directory (where its `package.json` resides) run the following:

    npm install mocha
    npm test
@@ -92,6 +144,7 @@ You can also use [`npm`](https://github.com/isaacs/npm) to download and install
* [Carter Thaxton](https://github.com/carter-thaxton)
* [Audrius Kažukauskas](https://github.com/audriusk)
* [Johannes Schauer](https://github.com/pyneo)
* [Mithgol](https://github.com/Mithgol)
{
'includes': [ 'deps/common-sqlite.gypi' ],
'variables': {
'sqlite%':'internal',
},
'targets': [
{
'target_name': 'node_sqlite3',
'conditions': [
['sqlite != "internal"', {
'libraries': [
'-L<@(sqlite)/lib',
'-lsqlite3'
],
'include_dirs': [ '<@(sqlite)/include' ]
},
{
'dependencies': [
'deps/sqlite3.gyp:sqlite3'
]
}
]
],
'sources': [
'src/database.cc',
'src/node_sqlite3.cc',
'src/statement.cc'
],
'dependencies': [
'deps/sqlite3/binding.gyp:sqlite3'
]
}
]
}
var https = require("https");
var http = require("http");
var fs = require("fs");
var url = require('url');
var semver = require('semver');
var cross = {};
// https://github.com/developmentseed/node-sqlite3/wiki/Binaries
var template = 'https://raw.github.com/joyent/node/v{VERSION}/src/';
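// Return a copy of obj with its keys (node version strings) sorted in ascending semver order.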
var sortObjectByKey = function(obj){
var keys = [];
var sorted_obj = {};
for(var key in obj){
if(obj.hasOwnProperty(key)){
keys.push(key);
}
}
// sort keys
keys.sort(function(a,b) {
if (semver.gt(a, b)) {
return 1
}
return -1;
});
var len = keys.length;
for (var i = 0; i < len; i++) {
var key = keys[i];
sorted_obj[key] = obj[key];
}
return sorted_obj;
};
function get(ver,callback) {
var header = 'node.h';
if (semver.gt(ver, 'v0.11.4')) {
// https://github.com/joyent/node/commit/44ed42bd971d58b294222d983cfe2908e021fb5d#src/node_version.h
header = 'node_version.h';
}
var path = template.replace('{VERSION}',ver) + header;
var uri = url.parse(path);
https.get(uri, function(res) {
if (res.statusCode != 200) {
throw new Error("server returned " + res.statusCode + ' for: ' + path);
}
res.setEncoding('utf8');
var body = '';
res.on('data', function (chunk) {
body += chunk;
});
res.on('end',function(err) {
var term = 'define NODE_MODULE_VERSION'
var idx = body.indexOf(term);
var following = body.slice(idx);
var end = following.indexOf('\n');
var value = following.slice(term.length,end).trim();
if (value[0] === '(' && value[value.length-1] == ')') {
value = value.slice(1,value.length-1);
} else if (value.indexOf(' ') > -1) {
value = value.slice(0,value.indexOf(' '));
}
var int_val = +value;
cross[ver] = int_val;
return callback(null,ver,int_val);
})
});
}
process.on('exit', function(err) {
var sorted = sortObjectByKey(cross);
console.log(sorted);
})
var versions_doc = 'http://nodejs.org/dist/npm-versions.txt';
http.get(url.parse(versions_doc), function(res) {
if (res.statusCode != 200) {
throw new Error("server returned " + res.statusCode + ' for: ' + versions_doc);
}
res.setEncoding('utf8');
var body = '';
res.on('data', function (chunk) {
body += chunk;
});
res.on('end',function(err) {
var lines = body.split('\n').map(function(line) { return line.split(' ')[0].slice(1); }).filter(function(line) { return (line.length && line != 'node'); });
lines.forEach(function(ver) {
get(ver,function(err,version,result) {
cross[version] = result;
});
});
});
});
\ No newline at end of file
export ROOTDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
export UNAME=$(uname -s);
cd $ROOTDIR
cd ../
if [ ${UNAME} = 'Darwin' ]; then
# note: requires FAT (dual-arch) node installed via .pkg
npm install --stage --target_arch=ia32
npm install --stage --target_arch=ia32 --debug
npm install --stage --target_arch=x64
npm install --stage --target_arch=x64 --debug
elif [ ${UNAME} = 'Linux' ]; then
rm -rf ./bin/linux-*
apt-get -y update
apt-get -y install git make build-essential
git clone https://github.com/creationix/nvm.git ~/.nvm
source ~/.nvm/nvm.sh
nvm install 0.10
npm install -g node-gyp
node ./build.js --target_arch=x64
# now do 32 bit
NVER=`node -v`
wget http://nodejs.org/dist/${NVER}/node-${NVER}-linux-x86.tar.gz
tar xf node-${NVER}-linux-x86.tar.gz
export PATH=$(pwd)/node-${NVER}-linux-x86/bin:$PATH
# ignore:
# dependency problems - leaving unconfigured gcc-4.6:i386 g++-4.6:i386 libstdc++6-4.6-dev:i386
# E: Sub-process /usr/bin/dpkg returned an error code (1)
apt-get -y install binutils:i386 cpp:i386 gcc-4.6:i386 g++-4.6:i386 libstdc++6-4.6-dev:i386 | true
CC=gcc-4.6 CXX=g++-4.6 node ./build.js --target_arch=ia32
fi
\ No newline at end of file
var ProgressBar = require('progress');
var http = require('http');
var url = require('url');
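// Fetch a remote file over HTTP and return its full contents as a single Buffer,
// optionally drawing a progress bar while the chunks arrive.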
function download(from,options,callback) {
var options = options || {};
var uri = url.parse(from);
var req = http.request(uri);
req.on('response', function(res){
// needed for end to be called
res.resume();
if (res.statusCode !== 200) {
return callback(new Error('Server returned '+ res.statusCode));
}
if (options.progress) {
var len = parseInt(res.headers['content-length'], 10);
console.log();
var bar = new ProgressBar('Downloading [:bar] :percent :etas', {
complete: '='
, incomplete: ' '
, width: 40
, total: len
});
}
function returnBuffer() {
// todo - use http://nodejs.org/api/buffer.html#buffer_class_method_buffer_concat_list_totallength
for (var length = 0, i = 0; i < out.length; ++i) {
length += out[i].length;
}
var result = new Buffer(length);
for (var pos = 0, j = 0; j < out.length; ++j) {
out[j].copy(result, pos);
pos += out[j].length;
}
return callback(null,result);
}
var out = [];
res.on('data', function(chunk) {
if (options.progress) bar.tick(chunk.length);
out.push(chunk);
});
res.on('end', function(){
if (options.progress) console.log('\n');
returnBuffer();
});
res.on('close', function(){
returnBuffer();
});
});
req.on('error', function(err){
callback(err);
});
req.end();
}
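// Normalize the arguments handed over by npm (including npm_config_argv) and pick out
// the flags this build script cares about: --debug, --stage, --build-from-source and
// --target_arch.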
function parse_args(_args, opts) {
// first split them like npm returns
var args = [];
_args.forEach(function(a) {
var parts = a.split('=');
parts.forEach(function(p) {
args.push(p);
})
})
// respect flags passed to npm install
if (process.env.npm_config_argv) {
var argv_obj = JSON.parse(process.env.npm_config_argv);
args = args.concat(argv_obj.cooked.slice(1))
}
var debug = (args.indexOf('--debug') > -1);
if (debug) opts.configuration = 'Debug';
opts.stage = (args.indexOf('--stage') > -1);
if (opts.stage) {
opts.force = true;
} else {
var from_source = args.indexOf('--build-from-source');
if ( from_source > -1) {
// no specific module name passed
var next_arg = args[from_source+1];
if (!next_arg || next_arg.indexOf('--') <= 0) {
opts.force = true;
} else if (next_arg == 'sqlite3'){
opts.force = true;
}
}
}
var target_arch = args.indexOf('--target_arch');
if (target_arch > -1) {
var next_arg = args[target_arch+1];
if (next_arg && next_arg.indexOf('--') < 0) {
opts.target_arch = next_arg;
}
}
opts.args = args;
return opts;
}
module.exports.parse_args = parse_args;
module.exports.download = download;
\ No newline at end of file
export ROOTDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
export DRY_RUN="--dry-run"
export PATTERN="*.*"
export CHECK_MD5="--no-check-md5"
function make_shas {
for i in $(ls *.tar.gz); do
shasum_file="${i//.tar.gz/.sha1.txt}";
if [ ! -f "${shasum_file}" ]; then
echo generating "${shasum_file}"
shasum $i | awk '{print $1}' > "${shasum_file}"
fi
done
}
cd ${ROOTDIR}/../stage/
if [ -d Debug ]; then
cd Debug
make_shas
../../../s3cmd/s3cmd sync --acl-public ${CHECK_MD5} ./${PATTERN} s3://node-sqlite3/Debug/ ${DRY_RUN}
cd ../
fi
if [ -d Release ]; then
cd Release
make_shas
../../../s3cmd/s3cmd sync --acl-public ${CHECK_MD5} ./${PATTERN} s3://node-sqlite3/Release/ ${DRY_RUN}
cd ../
fi
#../../s3cmd/s3cmd ls s3://node-sqlite3/
#!/usr/bin/env node
/*
TODO
- verbose/quiet mode
- travis/nvm/32bit auto-build and post to s3 for linux
- cloudfront + logging
- script to check for acl-public
- use require() to support node_modules location of binary?
- consider json config for configuring build and for handling routing remotely
- drop tar.gz - use node-tar directly - https://github.com/isaacs/node-tar/issues/11
*/
var package_json = require('./package.json');
var Binary = require('./lib/binary_name.js').Binary;
var util = require('./build-util/tools.js');
var mkdirp = require('mkdirp');
var targz = require('tar.gz');
var cp = require('child_process');
var fs = require('fs');
var path = require('path');
var os = require('os');
var crypto = require('crypto');
var opts = {
name: 'node_sqlite3',
force: false,
stage: false,
configuration: 'Release',
target_arch: process.arch,
platform: process.platform,
uri: 'http://node-sqlite3.s3.amazonaws.com/',
paths: {}
}
function log(msg) {
console.log('['+package_json.name+']: ' + msg);
}
// only for dev
function log_debug(msg) {
//log(msg);
}
function done(err) {
if (err) {
log(err);
process.exit(1);
}
process.exit(0);
}
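// Check that the installed binary actually works: stat it, then load lib/sqlite3 in a
// child process (via `arch` on OS X when targeting a different architecture) and, if
// that fails and try_build is set, fall back to compiling from source.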
function test(opts,try_build,callback) {
fs.statSync(opts.paths.runtime_module_path);
var args = [];
var shell_cmd;
var arch_names = {
'ia32':'-i386',
'x64':'-x86_64'
}
if (process.platform === 'darwin' && arch_names[opts.target_arch]) {
shell_cmd = 'arch';
args.push(arch_names[opts.target_arch]);
args.push(process.execPath);
} else if (process.arch == opts.target_arch) {
shell_cmd = process.execPath;
}
if (!shell_cmd) {
// system we cannot test on - likely since we are cross compiling
log("Skipping testing binary for " + process.target_arch);
return callback();
}
args.push('lib/sqlite3');
cp.execFile(shell_cmd, args, function(err, stdout, stderr) {
if (err || stderr) {
var output = err.message || stderr;
log('Testing the binary failed: "' + output + '"');
if (try_build) {
log('Attempting source compile...');
build(opts,callback);
}
} else {
log('Sweet: "' + opts.binary.filename() + '" is valid, node-sqlite3 is now installed!');
return callback();
}
});
}
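// Compile the bindings from source by spawning `node-gyp rebuild` with any extra
// arguments, then move the resulting binary into place.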
function build(opts,callback) {
var shell_cmd = process.platform === 'win32' ? 'node-gyp.cmd' : 'node-gyp';
var shell_args = ['rebuild'].concat(opts.args);
var cmd = cp.spawn(shell_cmd,shell_args);
cmd.on('error', function(err) {
if (err) {
return callback(new Error("Failed to execute '" + shell_cmd + ' ' + shell_args.join(' ') + "' (" + err + ")"));
}
});
cmd.stdout.on('data',function(data) {
console.log(data.slice(0,data.length-1).toString());
})
// TODO - this node-gyp output comes through formatted poorly, hence disabled
/*
cmd.stderr.on('data',function(data) {
console.error(data.slice(0,data.length-1).toString());
})
*/
cmd.on('exit', function(err) {
if (err) {
if (err === 127) {
console.error(
'node-gyp not found! Please upgrade your install of npm! You need at least 1.1.5 (I think) '+
'and preferably 1.1.30.'
);
} else {
console.error('Build failed');
}
return callback(err);
}
move(opts,callback);
});
}
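// Compress the staged binary directory into a .tar.gz and write a matching .sha1.txt
// checksum next to it, ready for upload.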
function tarball(opts,callback) {
var source = path.dirname(opts.paths.staged_module_file_name);
log('compressing: ' + source + ' to ' + opts.paths.tarball_path);
new targz(9).compress(source, opts.paths.tarball_path, function(err) {
if (err) return callback(err);
log('Versioned binary staged for upload at ' + opts.paths.tarball_path);
var sha1 = crypto.createHash('sha1');
fs.readFile(opts.paths.tarball_path,function(err,buffer) {
if (err) return callback(err);
sha1.update(buffer);
log('Writing shasum at ' + opts.paths.tarball_shasum);
fs.writeFile(opts.paths.tarball_shasum,sha1.digest('hex'),callback);
});
});
}
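// Move the freshly built binary from build/ into lib/; when --stage is set, also copy
// it into stage/ together with a build-info.json metadata file and create the tarball.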
function move(opts,callback) {
try {
fs.statSync(opts.paths.build_module_path);
} catch (ex) {
return callback(new Error('Build succeeded but target not found at ' + opts.paths.build_module_path));
}
try {
mkdirp.sync(path.dirname(opts.paths.runtime_module_path));
log('Created: ' + path.dirname(opts.paths.runtime_module_path));
} catch (err) {
log_debug(err);
}
fs.renameSync(opts.paths.build_module_path,opts.paths.runtime_module_path);
if (opts.stage) {
try {
mkdirp.sync(path.dirname(opts.paths.staged_module_file_name));
log('Created: ' + path.dirname(opts.paths.staged_module_file_name))
} catch (err) {
log_debug(err);
}
fs.writeFileSync(opts.paths.staged_module_file_name,fs.readFileSync(opts.paths.runtime_module_path));
// drop build metadata into build folder
var metapath = path.join(path.dirname(opts.paths.staged_module_file_name),'build-info.json');
// more build info
opts.date = new Date();
opts.node_features = process.features;
opts.versions = process.versions;
opts.config = process.config;
opts.execPath = process.execPath;
fs.writeFileSync(metapath,JSON.stringify(opts,null,2));
tarball(opts,callback);
} else {
log('Installed in ' + opts.paths.runtime_module_path + '');
test(opts,false,callback);
}
}
function rel(p) {
return path.relative(process.cwd(),p);
}
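// Main flow: parse the npm/CLI arguments, work out all relevant paths, then either
// force a source build or try a prebuilt binary (downloading it from S3, verifying
// its shasum and extracting it, with a source compile as the fallback).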
var opts = util.parse_args(process.argv.slice(2),opts);
opts.binary = new Binary(opts);
var versioned = opts.binary.getRequirePath();
opts.paths.runtime_module_path = rel(path.join(__dirname, 'lib', versioned));
opts.paths.runtime_folder = rel(path.join(__dirname, 'lib', 'binding',opts.binary.configuration));
var staged_module_path = path.join(__dirname, 'stage', opts.binary.getModuleAbi(), opts.binary.getBasePath());
opts.paths.staged_module_file_name = rel(path.join(staged_module_path,opts.binary.filename()));
opts.paths.build_module_path = rel(path.join(__dirname, 'build', opts.binary.configuration, opts.binary.filename()));
opts.paths.tarball_path = rel(path.join(__dirname, 'stage', opts.binary.configuration, opts.binary.getArchivePath()));
opts.paths.tarball_shasum = opts.paths.tarball_path.replace(opts.binary.compression(),'.sha1.txt');
if (!{ia32: true, x64: true, arm: true}.hasOwnProperty(opts.target_arch)) {
return done(new Error('Unsupported (?) architecture: '+ opts.target_arch+ ''));
}
if (opts.force) {
build(opts,done);
} else {
try {
test(opts,true,done);
} catch (ex) {
var from = opts.binary.getRemotePath();
var tmpdirbase = '/tmp/';
if (os.tmpdir) {
tmpdirbase = os.tmpdir();
}
var tmpdir = path.join(tmpdirbase,'node-sqlite3-'+opts.binary.configuration);
try {
mkdirp.sync(tmpdir);
} catch (err) {
log_debug(err);
}
log('Checking for ' + from);
util.download(from,{progress:false}, function(err,buffer) {
if (err) {
log(from + ' not found, falling back to source compile (' + err + ')');
return build(opts,done);
}
// calculate shasum of tarball
var sha1 = crypto.createHash('sha1');
sha1.update(buffer);
var actual_shasum = sha1.digest('hex');
// write local tarball now to make debugging easier if following checks fail
var tmpfile = path.join(tmpdir,path.basename(from));
fs.writeFile(tmpfile,buffer,function(err) {
if (err) return done(err);
log('Downloaded to: '+ tmpfile);
// fetch shasum expected value
var from_shasum = from.replace(opts.binary.compression(),'.sha1.txt');
log('Checking for ' + from_shasum);
util.download(from_shasum,{progress:false},function(err,expected_shasum_buffer) {
if (err) {
log(from_shasum + ' not found, skipping shasum check (' + err + ')');
} else {
// now check shasum match
var expected = expected_shasum_buffer.toString().trim();
if (expected !== actual_shasum) {
return done(new Error("shasum does not match between remote and local: " + expected + ' ' + actual_shasum));
} else {
log('Sha1sum matches! ' + expected);
}
// we are good: continue
log('Extracting to ' + opts.paths.runtime_folder);
new targz().extract(tmpfile, opts.paths.runtime_folder, function(err) {
if (err) return done(err);
try {
return test(opts,true,done);
} catch (ex) {
// Stat failed
log(opts.paths.runtime_folder + ' not found, falling back to source compile');
return build(opts,done);
}
});
}
});
});
});
}
}
{
'variables': {
'sqlite_version%':'3071700'
}
}
\ No newline at end of file
import sys
import tarfile
import os
tarball = os.path.abspath(sys.argv[1])
dirname = os.path.abspath(sys.argv[2])
tfile = tarfile.open(tarball, 'r:gz')
tfile.extractall(dirname)
sys.exit(0)
{
'includes': [ 'common-sqlite.gypi' ],
'target_defaults': {
'default_configuration': 'Debug',
'configurations': {
@@ -39,11 +40,39 @@
'targets': [
{
'target_name': 'action_before_build',
'type': 'none',
'hard_dependency': 1,
'actions': [
{
'action_name': 'unpack_sqlite_dep',
'inputs': [
'./sqlite-autoconf-<@(sqlite_version).tar.gz'
],
'outputs': [
'<(SHARED_INTERMEDIATE_DIR)/sqlite-autoconf-<@(sqlite_version)/sqlite3.c'
],
'action': ['python','./extract.py','./sqlite-autoconf-<@(sqlite_version).tar.gz','<(SHARED_INTERMEDIATE_DIR)']
}
],
'direct_dependent_settings': {
'include_dirs': [
'<(SHARED_INTERMEDIATE_DIR)/sqlite-autoconf-<@(sqlite_version)/',
]
},
},
{
'target_name': 'sqlite3',
'type': 'static_library',
'include_dirs': [ '.' ],
'include_dirs': [ '<(SHARED_INTERMEDIATE_DIR)/sqlite-autoconf-<@(sqlite_version)/' ],
'dependencies': [
'action_before_build'
],
'sources': [
'<(SHARED_INTERMEDIATE_DIR)/sqlite-autoconf-<@(sqlite_version)/sqlite3.c'
],
'direct_dependent_settings': {
'include_dirs': [ '.' ],
'include_dirs': [ '<(SHARED_INTERMEDIATE_DIR)/sqlite-autoconf-<@(sqlite_version)/' ],
'defines': [
'SQLITE_THREADSAFE=1',
'SQLITE_ENABLE_FTS3',
@@ -56,14 +85,9 @@
'SQLITE_ENABLE_FTS3',
'SQLITE_ENABLE_RTREE'
],
'sources': [ './sqlite3.c', ],
},
{
'target_name': 'shell',
'type': 'executable',
'dependencies': [ 'sqlite3' ],
'sources': [ './shell.c' ]
'export_dependent_settings': [
'action_before_build',
]
}
]
}
Installation Instructions
*************************
Copyright (C) 1994, 1995, 1996, 1999, 2000, 2001, 2002, 2004, 2005 Free
Software Foundation, Inc.
This file is free documentation; the Free Software Foundation gives
unlimited permission to copy, distribute and modify it.
Basic Installation
==================
These are generic installation instructions.
The `configure' shell script attempts to guess correct values for
various system-dependent variables used during compilation. It uses
those values to create a `Makefile' in each directory of the package.
It may also create one or more `.h' files containing system-dependent
definitions. Finally, it creates a shell script `config.status' that
you can run in the future to recreate the current configuration, and a
file `config.log' containing compiler output (useful mainly for
debugging `configure').
It can also use an optional file (typically called `config.cache'
and enabled with `--cache-file=config.cache' or simply `-C') that saves
the results of its tests to speed up reconfiguring. (Caching is
disabled by default to prevent problems with accidental use of stale
cache files.)
If you need to do unusual things to compile the package, please try
to figure out how `configure' could check whether to do them, and mail
diffs or instructions to the address given in the `README' so they can
be considered for the next release. If you are using the cache, and at
some point `config.cache' contains results you don't want to keep, you
may remove or edit it.
The file `configure.ac' (or `configure.in') is used to create
`configure' by a program called `autoconf'. You only need
`configure.ac' if you want to change it or regenerate `configure' using
a newer version of `autoconf'.
The simplest way to compile this package is:
1. `cd' to the directory containing the package's source code and type
`./configure' to configure the package for your system. If you're
using `csh' on an old version of System V, you might need to type
`sh ./configure' instead to prevent `csh' from trying to execute
`configure' itself.
Running `configure' takes awhile. While running, it prints some
messages telling which features it is checking for.
2. Type `make' to compile the package.
3. Optionally, type `make check' to run any self-tests that come with
the package.
4. Type `make install' to install the programs and any data files and
documentation.
5. You can remove the program binaries and object files from the
source code directory by typing `make clean'. To also remove the
files that `configure' created (so you can compile the package for
a different kind of computer), type `make distclean'. There is
also a `make maintainer-clean' target, but that is intended mainly
for the package's developers. If you use it, you may have to get
all sorts of other programs in order to regenerate files that came
with the distribution.
Compilers and Options
=====================
Some systems require unusual options for compilation or linking that the
`configure' script does not know about. Run `./configure --help' for
details on some of the pertinent environment variables.
You can give `configure' initial values for configuration parameters
by setting variables in the command line or in the environment. Here
is an example:
./configure CC=c89 CFLAGS=-O2 LIBS=-lposix
*Note Defining Variables::, for more details.
Compiling For Multiple Architectures
====================================
You can compile the package for more than one kind of computer at the
same time, by placing the object files for each architecture in their
own directory. To do this, you must use a version of `make' that
supports the `VPATH' variable, such as GNU `make'. `cd' to the
directory where you want the object files and executables to go and run
the `configure' script. `configure' automatically checks for the
source code in the directory that `configure' is in and in `..'.
If you have to use a `make' that does not support the `VPATH'
variable, you have to compile the package for one architecture at a
time in the source code directory. After you have installed the
package for one architecture, use `make distclean' before reconfiguring
for another architecture.
Installation Names
==================
By default, `make install' installs the package's commands under
`/usr/local/bin', include files under `/usr/local/include', etc. You
can specify an installation prefix other than `/usr/local' by giving
`configure' the option `--prefix=PREFIX'.
You can specify separate installation prefixes for
architecture-specific files and architecture-independent files. If you
pass the option `--exec-prefix=PREFIX' to `configure', the package uses
PREFIX as the prefix for installing programs and libraries.
Documentation and other data files still use the regular prefix.
In addition, if you use an unusual directory layout you can give
options like `--bindir=DIR' to specify different values for particular
kinds of files. Run `configure --help' for a list of the directories
you can set and what kinds of files go in them.
If the package supports it, you can cause programs to be installed
with an extra prefix or suffix on their names by giving `configure' the
option `--program-prefix=PREFIX' or `--program-suffix=SUFFIX'.
Optional Features
=================
Some packages pay attention to `--enable-FEATURE' options to
`configure', where FEATURE indicates an optional part of the package.
They may also pay attention to `--with-PACKAGE' options, where PACKAGE
is something like `gnu-as' or `x' (for the X Window System). The
`README' should mention any `--enable-' and `--with-' options that the
package recognizes.
For packages that use the X Window System, `configure' can usually
find the X include and library files automatically, but if it doesn't,
you can use the `configure' options `--x-includes=DIR' and
`--x-libraries=DIR' to specify their locations.
Specifying the System Type
==========================
There may be some features `configure' cannot figure out automatically,
but needs to determine by the type of machine the package will run on.
Usually, assuming the package is built to be run on the _same_
architectures, `configure' can figure that out, but if it prints a
message saying it cannot guess the machine type, give it the
`--build=TYPE' option. TYPE can either be a short name for the system
type, such as `sun4', or a canonical name which has the form:
CPU-COMPANY-SYSTEM
where SYSTEM can have one of these forms:
OS KERNEL-OS
See the file `config.sub' for the possible values of each field. If
`config.sub' isn't included in this package, then this package doesn't
need to know the machine type.
If you are _building_ compiler tools for cross-compiling, you should
use the option `--target=TYPE' to select the type of system they will
produce code for.
If you want to _use_ a cross compiler, that generates code for a
platform different from the build platform, you should specify the
"host" platform (i.e., that on which the generated programs will
eventually be run) with `--host=TYPE'.
Sharing Defaults
================
If you want to set default values for `configure' scripts to share, you
can create a site shell script called `config.site' that gives default
values for variables like `CC', `cache_file', and `prefix'.
`configure' looks for `PREFIX/share/config.site' if it exists, then
`PREFIX/etc/config.site' if it exists. Or, you can set the
`CONFIG_SITE' environment variable to the location of the site script.
A warning: not all `configure' scripts look for a site script.
Defining Variables
==================
Variables not defined in a site shell script can be set in the
environment passed to `configure'. However, some packages may run
configure again during the build, and the customized values of these
variables may be lost. In order to avoid this problem, you should set
them in the `configure' command line, using `VAR=value'. For example:
./configure CC=/usr/local2/bin/gcc
causes the specified `gcc' to be used as the C compiler (unless it is
overridden in the site shell script). Here is another example:
/bin/bash ./configure CONFIG_SHELL=/bin/bash
Here the `CONFIG_SHELL=/bin/bash' operand causes subsequent
configuration-related scripts to be executed by `/bin/bash'.
`configure' Invocation
======================
`configure' recognizes the following options to control how it operates.
`--help'
`-h'
Print a summary of the options to `configure', and exit.
`--version'
`-V'
Print the version of Autoconf used to generate the `configure'
script, and exit.
`--cache-file=FILE'
Enable the cache: use and save the results of the tests in FILE,
traditionally `config.cache'. FILE defaults to `/dev/null' to
disable caching.
`--config-cache'
`-C'
Alias for `--cache-file=config.cache'.
`--quiet'
`--silent'
`-q'
Do not print messages saying which checks are being made. To
suppress all normal output, redirect it to `/dev/null' (any error
messages will still be shown).
`--srcdir=DIR'
Look for the package's source code in directory DIR. Usually
`configure' can determine that directory automatically.
`configure' also accepts some other, not widely useful, options. Run
`configure --help' for more details.
AM_CFLAGS = @THREADSAFE_FLAGS@ @DYNAMIC_EXTENSION_FLAGS@ -DSQLITE_ENABLE_FTS3 -DSQLITE_ENABLE_RTREE
lib_LTLIBRARIES = libsqlite3.la
libsqlite3_la_SOURCES = sqlite3.c
libsqlite3_la_LDFLAGS = -no-undefined -version-info 8:6:8
bin_PROGRAMS = sqlite3
sqlite3_SOURCES = shell.c sqlite3.h
sqlite3_LDADD = $(top_builddir)/libsqlite3.la @READLINE_LIBS@
sqlite3_DEPENDENCIES = $(top_builddir)/libsqlite3.la
include_HEADERS = sqlite3.h sqlite3ext.h
EXTRA_DIST = sqlite3.pc sqlite3.1 tea
pkgconfigdir = ${libdir}/pkgconfig
pkgconfig_DATA = sqlite3.pc
man_MANS = sqlite3.1
This package contains:
* the SQLite library amalgamation (single file) source code distribution,
* the shell.c file used to build the sqlite3 shell tool, and
* the sqlite3.h and sqlite3ext.h header files required to link programs
and sqlite extensions against the installed library.
* autoconf/automake installation infrastructure.
The generic installation instructions for autoconf/automake are found
in the INSTALL file.
The following SQLite specific boolean options are supported:
--enable-readline use readline in shell tool [default=yes]
--enable-threadsafe build a thread-safe library [default=yes]
--enable-dynamic-extensions support loadable extensions [default=yes]
The default value for the CFLAGS variable (options passed to the C
compiler) includes debugging symbols in the build, resulting in larger
binaries than are necessary. Override it on the configure command
line like this:
$ CFLAGS="-Os" ./configure
to produce a smaller installation footprint.
Other SQLite compilation parameters can also be set using CFLAGS. For
example:
$ CFLAGS="-Os -DSQLITE_OMIT_TRIGGERS" ./configure
#-----------------------------------------------------------------------
# Supports the following non-standard switches.
#
# --enable-threadsafe
# --enable-readline
# --enable-dynamic-extensions
#
AC_PREREQ(2.61)
AC_INIT(sqlite, 3.7.13, http://www.sqlite.org)
AC_CONFIG_SRCDIR([sqlite3.c])
# Use automake.
AM_INIT_AUTOMAKE([foreign])
AC_SYS_LARGEFILE
# Check for required programs.
AC_PROG_CC
AC_PROG_RANLIB
AC_PROG_LIBTOOL
AC_PROG_MKDIR_P
# Check for library functions that SQLite can optionally use.
AC_CHECK_FUNCS([fdatasync usleep fullfsync localtime_r gmtime_r])
AC_FUNC_STRERROR_R
AC_CONFIG_FILES([Makefile sqlite3.pc])
AC_SUBST(BUILD_CFLAGS)
#-----------------------------------------------------------------------
# --enable-readline
#
AC_ARG_ENABLE(readline, [AS_HELP_STRING(
[--enable-readline],
[use readline in shell tool (yes, no) [default=yes]])],
[], [enable_readline=yes])
if test x"$enable_readline" != xno ; then
sLIBS=$LIBS
LIBS=""
AC_SEARCH_LIBS(tgetent, curses ncurses ncursesw, [], [])
AC_SEARCH_LIBS(readline, readline, [], [enable_readline=no])
AC_CHECK_FUNCS(readline, [], [])
READLINE_LIBS=$LIBS
LIBS=$sLIBS
fi
AC_SUBST(READLINE_LIBS)
#-----------------------------------------------------------------------
#-----------------------------------------------------------------------
# --enable-threadsafe
#
AC_ARG_ENABLE(threadsafe, [AS_HELP_STRING(
[--enable-threadsafe], [build a thread-safe library [default=yes]])],
[], [enable_threadsafe=yes])
THREADSAFE_FLAGS=-DSQLITE_THREADSAFE=0
if test x"$enable_threadsafe" != "xno"; then
THREADSAFE_FLAGS="-D_REENTRANT=1 -DSQLITE_THREADSAFE=1"
AC_SEARCH_LIBS(pthread_create, pthread)
fi
AC_SUBST(THREADSAFE_FLAGS)
#-----------------------------------------------------------------------
#-----------------------------------------------------------------------
# --enable-dynamic-extensions
#
AC_ARG_ENABLE(dynamic-extensions, [AS_HELP_STRING(
[--enable-dynamic-extensions], [support loadable extensions [default=yes]])],
[], [enable_dynamic_extensions=yes])
if test x"$enable_dynamic_extensions" != "xno"; then
AC_SEARCH_LIBS(dlopen, dl)
else
DYNAMIC_EXTENSION_FLAGS=-DSQLITE_OMIT_LOAD_EXTENSION=1
fi
AC_MSG_CHECKING([for whether to support dynamic extensions])
AC_MSG_RESULT($enable_dynamic_extensions)
AC_SUBST(DYNAMIC_EXTENSION_FLAGS)
#-----------------------------------------------------------------------
AC_CHECK_FUNCS(posix_fallocate)
#-----------------------------------------------------------------------
# UPDATE: Maybe it's better if users just set CFLAGS before invoking
# configure. This option doesn't really add much...
#
# --enable-tempstore
#
# AC_ARG_ENABLE(tempstore, [AS_HELP_STRING(
# [--enable-tempstore],
# [in-memory temporary tables (never, no, yes, always) [default=no]])],
# [], [enable_tempstore=no])
# AC_MSG_CHECKING([for whether or not to store temp tables in-memory])
# case "$enable_tempstore" in
# never ) TEMP_STORE=0 ;;
# no ) TEMP_STORE=1 ;;
# always ) TEMP_STORE=3 ;;
# yes ) TEMP_STORE=3 ;;
# * )
# TEMP_STORE=1
# enable_tempstore=yes
# ;;
# esac
# AC_MSG_RESULT($enable_tempstore)
# AC_SUBST(TEMP_STORE)
#-----------------------------------------------------------------------
AC_OUTPUT
#!/bin/sh
# install - install a program, script, or datafile
scriptversion=2005-05-14.22
# This originates from X11R5 (mit/util/scripts/install.sh), which was
# later released in X11R6 (xc/config/util/install.sh) with the
# following copyright and license.
#
# Copyright (C) 1994 X Consortium
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to
# deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# X CONSORTIUM BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
# AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNEC-
# TION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# Except as contained in this notice, the name of the X Consortium shall not
# be used in advertising or otherwise to promote the sale, use or other deal-
# ings in this Software without prior written authorization from the X Consor-
# tium.
#
#
# FSF changes to this file are in the public domain.
#
# Calling this script install-sh is preferred over install.sh, to prevent
# `make' implicit rules from creating a file called install from it
# when there is no Makefile.
#
# This script is compatible with the BSD install script, but was written
# from scratch. It can only install one file at a time, a restriction
# shared with many OS's install programs.
# set DOITPROG to echo to test this script
# Don't use :- since 4.3BSD and earlier shells don't like it.
doit="${DOITPROG-}"
# put in absolute paths if you don't have them in your path; or use env. vars.
mvprog="${MVPROG-mv}"
cpprog="${CPPROG-cp}"
chmodprog="${CHMODPROG-chmod}"
chownprog="${CHOWNPROG-chown}"
chgrpprog="${CHGRPPROG-chgrp}"
stripprog="${STRIPPROG-strip}"
rmprog="${RMPROG-rm}"
mkdirprog="${MKDIRPROG-mkdir}"
chmodcmd="$chmodprog 0755"
chowncmd=
chgrpcmd=
stripcmd=
rmcmd="$rmprog -f"
mvcmd="$mvprog"
src=
dst=
dir_arg=
dstarg=
no_target_directory=
usage="Usage: $0 [OPTION]... [-T] SRCFILE DSTFILE
or: $0 [OPTION]... SRCFILES... DIRECTORY
or: $0 [OPTION]... -t DIRECTORY SRCFILES...
or: $0 [OPTION]... -d DIRECTORIES...
In the 1st form, copy SRCFILE to DSTFILE.
In the 2nd and 3rd, copy all SRCFILES to DIRECTORY.
In the 4th, create DIRECTORIES.
Options:
-c (ignored)
-d create directories instead of installing files.
-g GROUP $chgrpprog installed files to GROUP.
-m MODE $chmodprog installed files to MODE.
-o USER $chownprog installed files to USER.
-s $stripprog installed files.
-t DIRECTORY install into DIRECTORY.
-T report an error if DSTFILE is a directory.
--help display this help and exit.
--version display version info and exit.
Environment variables override the default commands:
CHGRPPROG CHMODPROG CHOWNPROG CPPROG MKDIRPROG MVPROG RMPROG STRIPPROG
"
while test -n "$1"; do
case $1 in
-c) shift
continue;;
-d) dir_arg=true
shift
continue;;
-g) chgrpcmd="$chgrpprog $2"
shift
shift
continue;;
--help) echo "$usage"; exit $?;;
-m) chmodcmd="$chmodprog $2"
shift
shift
continue;;
-o) chowncmd="$chownprog $2"
shift
shift
continue;;
-s) stripcmd=$stripprog
shift
continue;;
-t) dstarg=$2
shift
shift
continue;;
-T) no_target_directory=true
shift
continue;;
--version) echo "$0 $scriptversion"; exit $?;;
*) # When -d is used, all remaining arguments are directories to create.
# When -t is used, the destination is already specified.
test -n "$dir_arg$dstarg" && break
# Otherwise, the last argument is the destination. Remove it from $@.
for arg
do
if test -n "$dstarg"; then
# $@ is not empty: it contains at least $arg.
set fnord "$@" "$dstarg"
shift # fnord
fi
shift # arg
dstarg=$arg
done
break;;
esac
done
if test -z "$1"; then
if test -z "$dir_arg"; then
echo "$0: no input file specified." >&2
exit 1
fi
# It's OK to call `install-sh -d' without argument.
# This can happen when creating conditional directories.
exit 0
fi
for src
do
# Protect names starting with `-'.
case $src in
-*) src=./$src ;;
esac
if test -n "$dir_arg"; then
dst=$src
src=
if test -d "$dst"; then
mkdircmd=:
chmodcmd=
else
mkdircmd=$mkdirprog
fi
else
# Waiting for this to be detected by the "$cpprog $src $dsttmp" command
# might cause directories to be created, which would be especially bad
# if $src (and thus $dsttmp) contains '*'.
if test ! -f "$src" && test ! -d "$src"; then
echo "$0: $src does not exist." >&2
exit 1
fi
if test -z "$dstarg"; then
echo "$0: no destination specified." >&2
exit 1
fi
dst=$dstarg
# Protect names starting with `-'.
case $dst in
-*) dst=./$dst ;;
esac
# If destination is a directory, append the input filename; won't work
# if double slashes aren't ignored.
if test -d "$dst"; then
if test -n "$no_target_directory"; then
echo "$0: $dstarg: Is a directory" >&2
exit 1
fi
dst=$dst/`basename "$src"`
fi
fi
# This sed command emulates the dirname command.
dstdir=`echo "$dst" | sed -e 's,/*$,,;s,[^/]*$,,;s,/*$,,;s,^$,.,'`
# Make sure that the destination directory exists.
# Skip lots of stat calls in the usual case.
if test ! -d "$dstdir"; then
defaultIFS='
'
IFS="${IFS-$defaultIFS}"
oIFS=$IFS
# Some sh's can't handle IFS=/ for some reason.
IFS='%'
set x `echo "$dstdir" | sed -e 's@/@%@g' -e 's@^%@/@'`
shift
IFS=$oIFS
pathcomp=
while test $# -ne 0 ; do
pathcomp=$pathcomp$1
shift
if test ! -d "$pathcomp"; then
$mkdirprog "$pathcomp"
# mkdir can fail with a `File exist' error in case several
# install-sh are creating the directory concurrently. This
# is OK.
test -d "$pathcomp" || exit
fi
pathcomp=$pathcomp/
done
fi
if test -n "$dir_arg"; then
$doit $mkdircmd "$dst" \
&& { test -z "$chowncmd" || $doit $chowncmd "$dst"; } \
&& { test -z "$chgrpcmd" || $doit $chgrpcmd "$dst"; } \
&& { test -z "$stripcmd" || $doit $stripcmd "$dst"; } \
&& { test -z "$chmodcmd" || $doit $chmodcmd "$dst"; }
else
dstfile=`basename "$dst"`
# Make a couple of temp file names in the proper directory.
dsttmp=$dstdir/_inst.$$_
rmtmp=$dstdir/_rm.$$_
# Trap to clean up those temp files at exit.
trap 'ret=$?; rm -f "$dsttmp" "$rmtmp" && exit $ret' 0
trap '(exit $?); exit' 1 2 13 15
# Copy the file name to the temp name.
$doit $cpprog "$src" "$dsttmp" &&
# and set any options; do chmod last to preserve setuid bits.
#
# If any of these fail, we abort the whole thing. If we want to
# ignore errors from any of these, just make sure not to ignore
# errors from the above "$doit $cpprog $src $dsttmp" command.
#
{ test -z "$chowncmd" || $doit $chowncmd "$dsttmp"; } \
&& { test -z "$chgrpcmd" || $doit $chgrpcmd "$dsttmp"; } \
&& { test -z "$stripcmd" || $doit $stripcmd "$dsttmp"; } \
&& { test -z "$chmodcmd" || $doit $chmodcmd "$dsttmp"; } &&
# Now rename the file to the real destination.
{ $doit $mvcmd -f "$dsttmp" "$dstdir/$dstfile" 2>/dev/null \
|| {
# The rename failed, perhaps because mv can't rename something else
# to itself, or perhaps because mv is so ancient that it does not
# support -f.
# Now remove or move aside any old file at destination location.
# We try this two ways since rm can't unlink itself on some
# systems and the destination file might be busy for other
# reasons. In this case, the final cleanup might fail but the new
# file should still install successfully.
{
if test -f "$dstdir/$dstfile"; then
$doit $rmcmd -f "$dstdir/$dstfile" 2>/dev/null \
|| $doit $mvcmd -f "$dstdir/$dstfile" "$rmtmp" 2>/dev/null \
|| {
echo "$0: cannot unlink or rename $dstdir/$dstfile" >&2
(exit 1); exit 1
}
else
:
fi
} &&
# Now rename the file to the real destination.
$doit $mvcmd "$dsttmp" "$dstdir/$dstfile"
}
}
fi || { (exit 1); exit 1; }
done
# The final little trick to "correctly" pass the exit status to the exit trap.
{
(exit 0); exit 0
}
# Local variables:
# eval: (add-hook 'write-file-hooks 'time-stamp)
# time-stamp-start: "scriptversion="
# time-stamp-format: "%:y-%02m-%02d.%02H"
# time-stamp-end: "$"
# End:
.\" Hey, EMACS: -*- nroff -*-
.\" First parameter, NAME, should be all caps
.\" Second parameter, SECTION, should be 1-8, maybe w/ subsection
.\" other parameters are allowed: see man(7), man(1)
.TH SQLITE3 1 "Mon Apr 15 23:49:17 2002"
.\" Please adjust this date whenever revising the manpage.
.\"
.\" Some roff macros, for reference:
.\" .nh disable hyphenation
.\" .hy enable hyphenation
.\" .ad l left justify
.\" .ad b justify to both left and right margins
.\" .nf disable filling
.\" .fi enable filling
.\" .br insert line break
.\" .sp <n> insert n+1 empty lines
.\" for manpage-specific macros, see man(7)
.SH NAME
.B sqlite3
\- A command line interface for SQLite version 3
.SH SYNOPSIS
.B sqlite3
.RI [ options ]
.RI [ databasefile ]
.RI [ SQL ]
.SH SUMMARY
.PP
.B sqlite3
is a terminal-based front-end to the SQLite library that can evaluate
queries interactively and display the results in multiple formats.
.B sqlite3
can also be used within shell scripts and other applications to provide
batch processing features.
.SH DESCRIPTION
To start a
.B sqlite3
interactive session, invoke the
.B sqlite3
command and optionally provide the name of a database file. If the
database file does not exist, it will be created. If the database file
does exist, it will be opened.
For example, to create a new database file named "mydata.db", create
a table named "memos" and insert a couple of records into that table:
.sp
$
.B sqlite3 mydata.db
.br
SQLite version 3.1.3
.br
Enter ".help" for instructions
.br
sqlite>
.B create table memos(text, priority INTEGER);
.br
sqlite>
.B insert into memos values('deliver project description', 10);
.br
sqlite>
.B insert into memos values('lunch with Christine', 100);
.br
sqlite>
.B select * from memos;
.br
deliver project description|10
.br
lunch with Christine|100
.br
sqlite>
.sp
If no database name is supplied, the ATTACH sql command can be used
to attach to existing or create new database files. ATTACH can also
be used to attach to multiple databases within the same interactive
session. This is useful for migrating data between databases,
possibly changing the schema along the way.
Optionally, a SQL statement or set of SQL statements can be supplied as
a single argument. Multiple statements should be separated by
semi-colons.
For example:
.sp
$
.B sqlite3 -line mydata.db 'select * from memos where priority > 20;'
.br
text = lunch with Christine
.br
priority = 100
.br
.sp
.SS SQLITE META-COMMANDS
.PP
The interactive interpreter offers a set of meta-commands that can be
used to control the output format, examine the currently attached
database files, or perform administrative operations upon the
attached databases (such as rebuilding indices). Meta-commands are
always prefixed with a dot (.).
A list of available meta-commands can be viewed at any time by issuing
the '.help' command. For example:
.sp
sqlite>
.B .help
.nf
.cc |
.databases List names and files of attached databases
.dump ?TABLE? ... Dump the database in an SQL text format
.echo ON|OFF Turn command echo on or off
.exit Exit this program
.explain ON|OFF Turn output mode suitable for EXPLAIN on or off.
.header(s) ON|OFF Turn display of headers on or off
.help Show this message
.import FILE TABLE Import data from FILE into TABLE
.indices TABLE Show names of all indices on TABLE
.mode MODE ?TABLE? Set output mode where MODE is one of:
csv Comma-separated values
column Left-aligned columns. (See .width)
html HTML <table> code
insert SQL insert statements for TABLE
line One value per line
list Values delimited by .separator string
tabs Tab-separated values
tcl TCL list elements
.nullvalue STRING Print STRING in place of NULL values
.output FILENAME Send output to FILENAME
.output stdout Send output to the screen
.prompt MAIN CONTINUE Replace the standard prompts
.quit Exit this program
.read FILENAME Execute SQL in FILENAME
.schema ?TABLE? Show the CREATE statements
.separator STRING Change separator used by output mode and .import
.show Show the current values for various settings
.tables ?PATTERN? List names of tables matching a LIKE pattern
.timeout MS Try opening locked tables for MS milliseconds
.width NUM NUM ... Set column widths for "column" mode
sqlite>
|cc .
.sp
.fi
.SH OPTIONS
.B sqlite3
has the following options:
.TP
.BI \-init\ file
Read and execute commands from
.I file
, which can contain a mix of SQL statements and meta-commands.
.TP
.B \-echo
Print commands before execution.
.TP
.B \-[no]header
Turn headers on or off.
.TP
.B \-column
Query results will be displayed in a table like form, using
whitespace characters to separate the columns and align the
output.
.TP
.B \-html
Query results will be output as simple HTML tables.
.TP
.B \-line
Query results will be displayed with one value per line, rows
separated by a blank line. Designed to be easily parsed by
scripts or other programs
.TP
.B \-list
Query results will be displayed with the separator (|, by default)
character between each field value. The default.
.TP
.BI \-separator\ separator
Set output field separator. Default is '|'.
.TP
.BI \-nullvalue\ string
Set string used to represent NULL values. Default is ''
(empty string).
.TP
.B \-version
Show SQLite version.
.TP
.B \-help
Show help on options and exit.
.SH INIT FILE
.B sqlite3
reads an initialization file to set the configuration of the
interactive environment. Throughout initialization, any previously
specified setting can be overridden. The sequence of initialization is
as follows:
o The default configuration is established as follows:
.sp
.nf
.cc |
mode = LIST
separator = "|"
main prompt = "sqlite> "
continue prompt = " ...> "
|cc .
.sp
.fi
o If the file
.B ~/.sqliterc
can be found in the user's home directory, it is read and processed first. It should generally only contain meta-commands.
o If the -init option is present, the specified file is processed.
o All other command line options are processed.
.SH SEE ALSO
http://www.sqlite.org/
.br
The sqlite-doc package
.SH AUTHOR
This manual page was originally written by Andreas Rottmann
<rotty@debian.org>, for the Debian GNU/Linux system (but may be used
by others). It was subsequently revised by Bill Bumgarner <bbum@mac.com>.
# Package Information for pkg-config
prefix=/usr/local
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir=${prefix}/include
Name: SQLite
Description: SQL database engine
Version: 3.7.8
Libs: -L${libdir} -lsqlite3
Libs.private: -ldl -lpthread
Cflags: -I${includedir}
# Package Information for pkg-config
prefix=@prefix@
exec_prefix=@exec_prefix@
libdir=@libdir@
includedir=@includedir@
Name: SQLite
Description: SQL database engine
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lsqlite3
Libs.private: @LIBS@
Cflags: -I${includedir}
This is the SQLite extension for Tcl using the Tcl Extension
Architecture (TEA). For additional information on SQLite see
http://www.sqlite.org/
UNIX BUILD
==========
Building under most UNIX systems is easy, just run the configure script
and then run make. For more information about the build process, see
the tcl/unix/README file in the Tcl src dist. The following minimal
example will install the extension in the /opt/tcl directory.
$ cd sqlite-*-tea
$ ./configure --prefix=/opt/tcl
$ make
$ make install
WINDOWS BUILD
=============
The recommended method to build extensions under windows is to use the
Msys + Mingw build process. This provides a Unix-style build while
generating native Windows binaries. Using the Msys + Mingw build tools
means that you can use the same configure script as per the Unix build
to create a Makefile. See the tcl/win/README file for the URL of
the Msys + Mingw download.
If you have VC++ then you may wish to use the files in the win
subdirectory and build the extension using just VC++. These files have
been designed to be as generic as possible but will require some
additional maintenance by the project developer to synchronise with
the TEA configure.in and Makefile.in files. Instructions for using the
VC++ makefile are written in the first part of the Makefile.vc
file.
#
# Include the TEA standard macro set
#
builtin(include,tclconfig/tcl.m4)
#
# Add here whatever m4 macros you want to define for your package
#
#!/bin/bash -norc
dnl This file is an input file used by the GNU "autoconf" program to
dnl generate the file "configure", which is run during Tcl installation
dnl to configure the system for the local environment.
#
# RCS: @(#) $Id: configure.in,v 1.43 2005/07/26 19:17:05 mdejong Exp $
#-----------------------------------------------------------------------
# Sample configure.in for Tcl Extensions. The only places you should
# need to modify this file are marked by the string __CHANGE__
#-----------------------------------------------------------------------
#-----------------------------------------------------------------------
# __CHANGE__
# Set your package name and version numbers here.
#
# This initializes the environment with PACKAGE_NAME and PACKAGE_VERSION
# set as provided. These will also be added as -D defs in your Makefile
# so you can encode the package version directly into the source files.
#-----------------------------------------------------------------------
AC_INIT([sqlite], [3.7.13])
#--------------------------------------------------------------------
# Call TEA_INIT as the first TEA_ macro to set up initial vars.
# This will define a ${TEA_PLATFORM} variable == "unix" or "windows"
# as well as PKG_LIB_FILE and PKG_STUB_LIB_FILE.
#--------------------------------------------------------------------
TEA_INIT([3.9])
AC_CONFIG_AUX_DIR(tclconfig)
#--------------------------------------------------------------------
# Load the tclConfig.sh file
#--------------------------------------------------------------------
TEA_PATH_TCLCONFIG
TEA_LOAD_TCLCONFIG
#--------------------------------------------------------------------
# Load the tkConfig.sh file if necessary (Tk extension)
#--------------------------------------------------------------------
#TEA_PATH_TKCONFIG
#TEA_LOAD_TKCONFIG
#-----------------------------------------------------------------------
# Handle the --prefix=... option by defaulting to what Tcl gave.
# Must be called after TEA_LOAD_TCLCONFIG and before TEA_SETUP_COMPILER.
#-----------------------------------------------------------------------
TEA_PREFIX
#-----------------------------------------------------------------------
# Standard compiler checks.
# This sets up CC by using the CC env var, or looks for gcc otherwise.
# This also calls AC_PROG_CC, AC_PROG_INSTALL and a few others to create
# the basic setup necessary to compile executables.
#-----------------------------------------------------------------------
TEA_SETUP_COMPILER
#-----------------------------------------------------------------------
# __CHANGE__
# Specify the C source files to compile in TEA_ADD_SOURCES,
# public headers that need to be installed in TEA_ADD_HEADERS,
# stub library C source files to compile in TEA_ADD_STUB_SOURCES,
# and runtime Tcl library files in TEA_ADD_TCL_SOURCES.
# This defines PKG(_STUB)_SOURCES, PKG(_STUB)_OBJECTS, PKG_HEADERS
# and PKG_TCL_SOURCES.
#-----------------------------------------------------------------------
TEA_ADD_SOURCES([tclsqlite3.c])
TEA_ADD_HEADERS([])
TEA_ADD_INCLUDES([-I\"`\${CYGPATH} \${srcdir}/generic`\"])
TEA_ADD_LIBS([])
TEA_ADD_CFLAGS([-DSQLITE_ENABLE_FTS3=1])
#TEA_ADD_CFLAGS([-DSQLITE_3_SUFFIX_ONLY=1])
TEA_ADD_STUB_SOURCES([])
TEA_ADD_TCL_SOURCES([])
#--------------------------------------------------------------------
# The --with-system-sqlite causes the TCL bindings to SQLite to use
# the system shared library for SQLite rather than statically linking
# against its own private copy. This is dangerous, leads to
# undesirable dependencies, and is not recommended.
# Patches from rmax.
#--------------------------------------------------------------------
AC_ARG_WITH([system-sqlite],
[AC_HELP_STRING([--with-system-sqlite],
[use a system-supplied libsqlite3 instead of the bundled one])],
[], [with_system_sqlite=no])
if test x$with_system_sqlite != xno; then
AC_CHECK_HEADER([sqlite3.h],
[AC_CHECK_LIB([sqlite3],[sqlite3_initialize],
[AC_DEFINE(USE_SYSTEM_SQLITE)
LIBS="$LIBS -lsqlite3"])])
fi
#--------------------------------------------------------------------
# __CHANGE__
# Choose which headers you need. Extension authors should try very
# hard to only rely on the Tcl public header files. Internal headers
# contain private data structures and are subject to change without
# notice.
# This MUST be called after TEA_LOAD_TCLCONFIG / TEA_LOAD_TKCONFIG
#--------------------------------------------------------------------
TEA_PUBLIC_TCL_HEADERS
#TEA_PRIVATE_TCL_HEADERS
#TEA_PUBLIC_TK_HEADERS
#TEA_PRIVATE_TK_HEADERS
#TEA_PATH_X
#--------------------------------------------------------------------
# Check whether --enable-threads or --disable-threads was given.
# This auto-enables if Tcl was compiled threaded.
#--------------------------------------------------------------------
TEA_ENABLE_THREADS
if test "${TCL_THREADS}" = "1" ; then
AC_DEFINE(SQLITE_THREADSAFE, 1, [Trigger sqlite threadsafe build])
# Not automatically added by Tcl because it's assumed Tcl links to them,
# but it may not if it isn't really a threaded build.
TEA_ADD_LIBS([$THREADS_LIBS])
else
AC_DEFINE(SQLITE_THREADSAFE, 0, [Trigger sqlite non-threadsafe build])
fi
#--------------------------------------------------------------------
# The statement below defines a collection of symbols related to
# building as a shared library instead of a static library.
#--------------------------------------------------------------------
TEA_ENABLE_SHARED
#--------------------------------------------------------------------
# This macro figures out what flags to use with the compiler/linker
# when building shared/static debug/optimized objects. This information
# can be taken from the tclConfig.sh file, but this figures it all out.
#--------------------------------------------------------------------
TEA_CONFIG_CFLAGS
#--------------------------------------------------------------------
# Set the default compiler switches based on the --enable-symbols option.
#--------------------------------------------------------------------
TEA_ENABLE_SYMBOLS
#--------------------------------------------------------------------
# Everyone should be linking against the Tcl stub library. If you
# can't for some reason, remove this definition. If you aren't using
# stubs, you also need to modify the SHLIB_LD_LIBS setting below to
# link against the non-stubbed Tcl library. Add Tk too if necessary.
#--------------------------------------------------------------------
AC_DEFINE(USE_TCL_STUBS, 1, [Use Tcl stubs])
#AC_DEFINE(USE_TK_STUBS, 1, [Use Tk stubs])
#--------------------------------------------------------------------
# Redefine fdatasync as fsync on systems that lack fdatasync
#--------------------------------------------------------------------
AC_CHECK_FUNC(fdatasync, , AC_DEFINE(fdatasync, fsync))
AC_FUNC_STRERROR_R
#--------------------------------------------------------------------
# This macro generates a line to use when building a library. It
# depends on values set by the TEA_ENABLE_SHARED, TEA_ENABLE_SYMBOLS,
# and TEA_LOAD_TCLCONFIG macros above.
#--------------------------------------------------------------------
TEA_MAKE_LIB
#--------------------------------------------------------------------
# Determine the name of the tclsh and/or wish executables in the
# Tcl and Tk build directories or the location they were installed
# into. These paths are used to support running test cases only,
# the Makefile should not be making use of these paths to generate
# a pkgIndex.tcl file or anything else at extension build time.
#--------------------------------------------------------------------
TEA_PROG_TCLSH
#TEA_PROG_WISH
#--------------------------------------------------------------------
# Finally, substitute all of the various values into the Makefile.
# You may alternatively have a special pkgIndex.tcl.in or other files
# which require substituting the AC variables in. Include these here.
#--------------------------------------------------------------------
AC_OUTPUT([Makefile pkgIndex.tcl])
.TH sqlite3 n 4.1 "Tcl-Extensions"
.HS sqlite3 tcl
.BS
.SH NAME
sqlite3 \- an interface to the SQLite3 database engine
.SH SYNOPSIS
\fBsqlite3\fI command_name ?filename?\fR
.br
.SH DESCRIPTION
SQLite3 is a self-contained, zero-configuration, transactional SQL database
engine. This extension provides an easy-to-use interface for accessing
SQLite database files from Tcl.
.PP
For full documentation see http://www.sqlite.org/ and
in particular http://www.sqlite.org/tclsqlite.html.
The author disclaims copyright to this source code. In place of
a legal notice, here is a blessing:
May you do good and not evil.
May you find forgiveness for yourself and forgive others.
May you share freely, never taking more than you give.
#
# Tcl package index file
#
# Note sqlite*3* init specifically
#
package ifneeded sqlite3 @PACKAGE_VERSION@ \
[list load [file join $dir @PKG_LIB_FILE@] Sqlite3]
#!/bin/sh
#
# install - install a program, script, or datafile
# This comes from X11R5; it is not part of GNU.
#
# $XConsortium: install.sh,v 1.2 89/12/18 14:47:22 jim Exp $
#
# This script is compatible with the BSD install script, but was written
# from scratch.
#
# set DOITPROG to echo to test this script
# Don't use :- since 4.3BSD and earlier shells don't like it.
doit="${DOITPROG-}"
# put in absolute paths if you don't have them in your path; or use env. vars.
mvprog="${MVPROG-mv}"
cpprog="${CPPROG-cp}"
chmodprog="${CHMODPROG-chmod}"
chownprog="${CHOWNPROG-chown}"
chgrpprog="${CHGRPPROG-chgrp}"
stripprog="${STRIPPROG-strip}"
rmprog="${RMPROG-rm}"
instcmd="$mvprog"
chmodcmd=""
chowncmd=""
chgrpcmd=""
stripcmd=""
rmcmd="$rmprog -f"
mvcmd="$mvprog"
src=""
dst=""
while [ x"$1" != x ]; do
case $1 in
-c) instcmd="$cpprog"
shift
continue;;
-m) chmodcmd="$chmodprog $2"
shift
shift
continue;;
-o) chowncmd="$chownprog $2"
shift
shift
continue;;
-g) chgrpcmd="$chgrpprog $2"
shift
shift
continue;;
-s) stripcmd="$stripprog"
shift
continue;;
*) if [ x"$src" = x ]
then
src=$1
else
dst=$1
fi
shift
continue;;
esac
done
if [ x"$src" = x ]
then
echo "install: no input file specified"
exit 1
fi
if [ x"$dst" = x ]
then
echo "install: no destination specified"
exit 1
fi
# If destination is a directory, append the input filename; if your system
# does not like double slashes in filenames, you may need to add some logic
if [ -d $dst ]
then
dst="$dst"/`basename $src`
fi
# Make a temp file name in the proper directory.
dstdir=`dirname $dst`
dsttmp=$dstdir/#inst.$$#
# Move or copy the file name to the temp name
$doit $instcmd $src $dsttmp
# and set any options; do chmod last to preserve setuid bits
if [ x"$chowncmd" != x ]; then $doit $chowncmd $dsttmp; fi
if [ x"$chgrpcmd" != x ]; then $doit $chgrpcmd $dsttmp; fi
if [ x"$stripcmd" != x ]; then $doit $stripcmd $dsttmp; fi
if [ x"$chmodcmd" != x ]; then $doit $chmodcmd $dsttmp; fi
# Now rename the file to the real destination.
$doit $rmcmd $dst
$doit $mvcmd $dsttmp $dst
exit 0
var path = require('path');
var Binary = function(options) {
options = options || {};
var package_json = options.package_json || require('../package.json');
this.name = options.name || 'binding';
this.configuration = options.configuration || 'Release';
this.uri = options.uri || 'http://'+this.name+'.s3.amazonaws.com/';
this.module_maj_min = package_json.version.split('.').slice(0,2).join('.');
this.module_abi = package_json.abi;
this.platform = options.platform || process.platform;
this.target_arch = options.target_arch || process.arch;
if (process.versions.modules) {
// added in >= v0.10.4 and v0.11.7
// https://github.com/joyent/node/commit/ccabd4a6fa8a6eb79d29bc3bbe9fe2b6531c2d8e
this.node_abi = 'node-v' + (+process.versions.modules);
} else {
this.node_abi = 'v8-' + process.versions.v8.split('.').slice(0,2).join('.');
}
}
Binary.prototype.filename = function() {
return this.name + '.node';
}
Binary.prototype.compression = function() {
return '.tar.gz';
}
Binary.prototype.getBasePath = function() {
return this.node_abi
+ '-' + this.platform
+ '-' + this.target_arch;
}
Binary.prototype.getRequirePath = function(configuration) {
return './' + path.join('binding',
configuration || this.configuration,
this.getBasePath(),
this.filename());
}
Binary.prototype.getModuleAbi = function() {
return this.name + '-v' + this.module_maj_min + '.' + this.module_abi;
}
Binary.prototype.getArchivePath = function() {
return this.getModuleAbi()
+ '-'
+ this.getBasePath()
+ this.compression();
}
Binary.prototype.getRemotePath = function() {
return this.uri+this.configuration+'/'+this.getArchivePath();
}
module.exports.Binary = Binary;
\ No newline at end of file
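For reference, here is a small sketch of what this `Binary` helper produces when combined with the `version` and `abi` fields of the package.json shown further below. It assumes it is run from the project root; the printed values are illustrative only, since they depend on the Node version, platform, and architecture of the machine.

``` js
// Minimal sketch (not part of the repository): compose the local require
// path and the remote S3 archive path for the compiled bindings.
var Binary = require('./lib/binary_name.js').Binary;

var binary = new Binary({ name: 'node_sqlite3' });

console.log(binary.getRequirePath('Release'));
// e.g. ./binding/Release/node-v11-darwin-x64/node_sqlite3.node

console.log(binary.getRemotePath());
// e.g. http://node_sqlite3.s3.amazonaws.com/Release/node_sqlite3-v2.1.a-node-v11-darwin-x64.tar.gz
```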
var sqlite3 = module.exports = exports = require('../build/Release/node_sqlite3.node');
var Binary = require('./binary_name.js').Binary;
var binary = new Binary({name:'node_sqlite3'});
var binding;
try {
    // Prefer a locally built Debug version of the bindings if one exists...
    binding = require(binary.getRequirePath('Debug'));
} catch (err) {
    // ...otherwise fall back to the Release build.
    binding = require(binary.getRequirePath('Release'));
}
var sqlite3 = module.exports = exports = binding;
var path = require('path');
var util = require('util');
var EventEmitter = require('events').EventEmitter;
......
{
"name": "sqlite3",
"description": "Asynchronous, non-blocking SQLite3 bindings",
"version": "2.1.7",
"version": "2.1.17",
"abi":"a",
"homepage": "http://github.com/developmentseed/node-sqlite3",
"author": {
"name": "Development Seed",
......@@ -22,21 +23,26 @@
"Audrius Kažukauskas <audrius@neutrino.lt>",
"Johannes Schauer <josch@pyneo.org>",
"Nathan Rajlich <nathan@tootallnate.net>",
"AJ ONeal <coolaj86@gmail.com>"
"AJ ONeal <coolaj86@gmail.com>",
"Mithgol"
],
"repository": {
"type": "git",
"url": "git://github.com/developmentseed/node-sqlite3.git"
},
"devDependencies": {
"step": "0.0.4",
"mocha": "~1.7"
"dependencies": {
"progress":"~1.0.1",
"mkdirp":"~0.3.5",
"tar.gz": "~0.1.1"
},
"bundledDependencies":["mkdirp","tar.gz","progress"],
"engines": {
"node": ">= 0.6.13 && < 0.11.0"
"node": ">= 0.6.13 < 0.11.0"
},
"scripts": {
"test": "mocha -R spec"
"install": "node build.js",
"pretest": "node test/support/createdb.js",
"test": "mocha -R spec --timeout 200000"
},
"licenses": [{ "type": "BSD" }],
"main": "./lib/sqlite3"
......
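The `build.js` referenced by the new `install` script is not part of this diff. Purely as an illustration of how the pieces in this package.json could fit together (the bundled `progress`, `mkdirp`, and `tar.gz` dependencies plus the S3 URL from `Binary.getRemotePath()`), here is a hedged sketch of a download-or-compile install step; it is not the project's actual build.js.

``` js
// Hypothetical install-time flow: try to fetch a prebuilt binary archive,
// and fall back to compiling from source if the download is unavailable.
var http = require('http');
var fs = require('fs');
var path = require('path');
var mkdirp = require('mkdirp');                             // bundled dependency
var exec = require('child_process').exec;
var Binary = require('./lib/binary_name.js').Binary;

var binary = new Binary({ name: 'node_sqlite3' });
var localPath = path.join('lib', binary.getRequirePath());  // final location of the .node file
var archive = localPath + '.tar.gz';                        // downloaded archive (unpacking omitted)
var remoteUrl = binary.getRemotePath();

mkdirp.sync(path.dirname(localPath));

http.get(remoteUrl, function(res) {
    if (res.statusCode !== 200) return fallbackToSource();
    // The real flow would show a progress bar and unpack the archive
    // (e.g. with the bundled tar.gz module); both are omitted here.
    res.pipe(fs.createWriteStream(archive))
       .on('close', function() { console.log('downloaded', remoteUrl); });
}).on('error', fallbackToSource);

function fallbackToSource() {
    exec('node-gyp rebuild', function(err) {
        if (err) throw err;
        console.log('built from source');
    });
}
```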
......@@ -50,7 +50,6 @@ public:
assert(handle->data != NULL);
Async* async = static_cast<Async*>(handle->data);
delete async;
handle->data = NULL;
}
void finish() {
......
#include <string.h>
#include <node.h>
#include <node_buffer.h>
#include <node_version.h>
#include "macros.h"
#include "database.h"
......@@ -190,9 +191,6 @@ template <class T> Values::Field*
else if (source->IsDate()) {
return new Values::Float(pos, source->NumberValue());
}
else if (source->IsUndefined()) {
return NULL;
}
else {
return NULL;
}
......@@ -260,7 +258,7 @@ bool Statement::Bind(const Parameters parameters) {
Parameters::const_iterator it = parameters.begin();
Parameters::const_iterator end = parameters.end();
for (; it < end; it++) {
for (; it < end; ++it) {
Values::Field* field = *it;
if (field != NULL) {
......@@ -544,7 +542,7 @@ void Statement::Work_AfterAll(uv_work_t* req) {
Local<Array> result(Array::New(baton->rows.size()));
Rows::const_iterator it = baton->rows.begin();
Rows::const_iterator end = baton->rows.end();
for (int i = 0; it < end; it++, i++) {
for (int i = 0; it < end; ++it, i++) {
result->Set(i, RowToJS(*it));
delete *it;
}
......@@ -647,7 +645,6 @@ void Statement::CloseCallback(uv_handle_t* handle) {
assert(handle->data != NULL);
Async* async = static_cast<Async*>(handle->data);
delete async;
handle->data = NULL;
}
void Statement::AsyncEach(uv_async_t* handle, int status) {
......@@ -671,7 +668,7 @@ void Statement::AsyncEach(uv_async_t* handle, int status) {
Rows::const_iterator it = rows.begin();
Rows::const_iterator end = rows.end();
for (int i = 0; it < end; it++, i++) {
for (int i = 0; it < end; ++it, i++) {
argv[1] = RowToJS(*it);
async->retrieved++;
TRY_CATCH_CALL(async->stmt->handle_, async->item_cb, 2, argv);
......@@ -745,7 +742,7 @@ Local<Object> Statement::RowToJS(Row* row) {
Row::const_iterator it = row->begin();
Row::const_iterator end = row->end();
for (int i = 0; it < end; it++, i++) {
for (int i = 0; it < end; ++it, i++) {
Values::Field* field = *it;
Local<Value> value;
......@@ -761,8 +758,11 @@ Local<Object> Statement::RowToJS(Row* row) {
value = Local<Value>(String::New(((Values::Text*)field)->value.c_str(), ((Values::Text*)field)->value.size()));
} break;
case SQLITE_BLOB: {
Buffer *buffer = Buffer::New(((Values::Blob*)field)->value, ((Values::Blob*)field)->length);
value = Local<Value>::New(buffer->handle_);
#if NODE_VERSION_AT_LEAST(0, 11, 3)
value = Local<Value>::New(Buffer::New(((Values::Blob*)field)->value, ((Values::Blob*)field)->length));
#else
value = Local<Value>::New(Buffer::New(((Values::Blob*)field)->value, ((Values::Blob*)field)->length)->handle_);
#endif
} break;
case SQLITE_NULL: {
value = Local<Value>::New(Null());
......
......@@ -195,11 +195,6 @@ public:
if (!finalized) Finalize();
}
protected:
static void Work_BeginPrepare(Database::Baton* baton);
static void Work_Prepare(uv_work_t* req);
static void Work_AfterPrepare(uv_work_t* req);
WORK_DEFINITION(Bind);
WORK_DEFINITION(Get);
WORK_DEFINITION(Run);
......@@ -207,10 +202,16 @@ protected:
WORK_DEFINITION(Each);
WORK_DEFINITION(Reset);
static Handle<Value> Finalize(const Arguments& args);
protected:
static void Work_BeginPrepare(Database::Baton* baton);
static void Work_Prepare(uv_work_t* req);
static void Work_AfterPrepare(uv_work_t* req);
static void AsyncEach(uv_async_t* handle, int status);
static void CloseCallback(uv_handle_t* handle);
static Handle<Value> Finalize(const Arguments& args);
static void Finalize(Baton* baton);
void Finalize();
......
......@@ -8,11 +8,11 @@
#define NODE_SQLITE3_MUTEX_t HANDLE mutex;
#define NODE_SQLITE3_MUTEX_INIT CreateMutex(NULL, FALSE, NULL);
#define NODE_SQLITE3_MUTEX_INIT mutex = CreateMutex(NULL, FALSE, NULL);
#define NODE_SQLITE3_MUTEX_LOCK(m) WaitForSingleObject(m, INFINITE);
#define NODE_SQLITE3_MUTEX_LOCK(m) WaitForSingleObject(*m, INFINITE);
#define NODE_SQLITE3_MUTEX_UNLOCK(m) ReleaseMutex(m);
#define NODE_SQLITE3_MUTEX_UNLOCK(m) ReleaseMutex(*m);
#define NODE_SQLITE3_MUTEX_DESTROY CloseHandle(mutex);
......
......@@ -3,6 +3,10 @@ var assert = require('assert');
var helper = require('./support/helper');
describe('cache', function() {
before(function() {
helper.ensureExists('test/tmp');
});
it('should cache Database objects while opening', function(done) {
var filename = 'test/tmp/test_cache.db';
helper.deleteFile(filename);
......
......@@ -2,6 +2,10 @@ var sqlite3 = require('..');
var assert = require('assert');
var exists = require('fs').existsSync || require('path').existsSync;
/*
// disabled because this is not a generically safe test to run on all systems
var spatialite_ext = '/usr/local/lib/libspatialite.dylib';
describe('loadExtension', function(done) {
......@@ -18,3 +22,5 @@ describe('loadExtension', function(done) {
it('libspatialite');
}
});
*/
\ No newline at end of file
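Since the spatialite test above is commented out, here is a hedged sketch of how such a guarded extension load could look. It assumes a `Database#loadExtension(path, callback)` method is available (as the disabled test implies); the `.dylib` path is machine-specific and purely illustrative.

``` js
// Minimal sketch: only attempt to load the extension if the shared library
// actually exists on this machine, so the check is safe to run anywhere.
var sqlite3 = require('sqlite3');
var exists = require('fs').existsSync || require('path').existsSync;

var spatialite_ext = '/usr/local/lib/libspatialite.dylib'; // illustrative path
var db = new sqlite3.Database(':memory:');

if (exists(spatialite_ext)) {
    db.loadExtension(spatialite_ext, function(err) {
        if (err) throw err;
        console.log('SpatiaLite loaded');
    });
} else {
    console.log('SpatiaLite not found; skipping');
}
```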
......@@ -7,6 +7,7 @@ describe('null error', function() {
var db;
before(function(done) {
helper.ensureExists('test/tmp');
helper.deleteFile(filename);
db = new sqlite3.Database(filename, done);
});
......
......@@ -4,6 +4,10 @@ var fs = require('fs');
var helper = require('./support/helper');
describe('open/close', function() {
before(function() {
helper.ensureExists('test/tmp');
});
describe('open and close non-existent database', function() {
before(function() {
helper.deleteFile('test/tmp/test_create.db');
......@@ -27,10 +31,10 @@ describe('open/close', function() {
});
});
it('should be unable to open an inaccessible database', function(done) {
it('should fail to open an inaccessible database', function(done) {
    // NOTE: this test points at a directory that does not exist, so
    // SQLite cannot create the database file there.
var db = new sqlite3.Database('/usr/bin/test.db', function(err) {
var db = new sqlite3.Database('/test/tmp/directory-does-not-exist/test.db', function(err) {
if (err && err.errno === sqlite3.CANTOPEN) {
done();
} else if (err) {
......
......@@ -17,21 +17,25 @@ describe('data types', function() {
it('should serialize Date()', function(done) {
var date = new Date();
db.run("INSERT INTO int_table VALUES(?)", date);
db.get("SELECT int FROM int_table", function(err, row) {
db.run("INSERT INTO int_table VALUES(?)", date, function (err) {
if (err) throw err;
assert.equal(row.int, +date);
done();
db.get("SELECT int FROM int_table", function(err, row) {
if (err) throw err;
assert.equal(row.int, +date);
done();
});
});
});
it('should serialize RegExp()', function(done) {
var regexp = /^f\noo/;
db.run("INSERT INTO txt_table VALUES(?)", regexp);
db.get("SELECT txt FROM txt_table", function(err, row) {
db.run("INSERT INTO txt_table VALUES(?)", regexp, function (err) {
if (err) throw err;
assert.equal(row.txt, String(regexp));
done();
db.get("SELECT txt FROM txt_table", function(err, row) {
if (err) throw err;
assert.equal(row.txt, String(regexp));
done();
});
});
});
......@@ -49,11 +53,13 @@ describe('data types', function() {
-Infinity
].forEach(function(flt) {
it('should serialize float ' + flt, function(done) {
db.run("INSERT INTO flt_table VALUES(?)", flt);
db.get("SELECT flt FROM flt_table", function(err, row) {
db.run("INSERT INTO flt_table VALUES(?)", flt, function (err) {
if (err) throw err;
assert.equal(row.flt, flt);
done();
db.get("SELECT flt FROM flt_table", function(err, row) {
if (err) throw err;
assert.equal(row.flt, flt);
done();
});
});
});
});
......@@ -70,11 +76,13 @@ describe('data types', function() {
-Infinity
].forEach(function(integer) {
it('should serialize integer ' + integer, function(done) {
db.run("INSERT INTO int_table VALUES(?)", integer);
db.get("SELECT int AS integer FROM int_table", function(err, row) {
db.run("INSERT INTO int_table VALUES(?)", integer, function (err) {
if (err) throw err;
assert.equal(row.integer, integer);
done();
db.get("SELECT int AS integer FROM int_table", function(err, row) {
if (err) throw err;
assert.equal(row.integer, integer);
done();
});
});
});
});
......
......@@ -6,6 +6,7 @@ describe('parallel', function() {
var db;
before(function(done) {
helper.deleteFile('test/tmp/test_parallel_inserts.db');
helper.ensureExists('test/tmp');
db = new sqlite3.Database('test/tmp/test_parallel_inserts.db', done);
});
......
#!/usr/bin/env node
var existsSync = require('fs').existsSync || require('path').existsSync;
var path = require('path');
var sqlite3 = require('../../lib/sqlite3');
var count = 1000000;
var db_path = path.join(__dirname,'big.db');
function randomString() {
var str = '';
var chars = 'abcdefghijklmnopqrstuvwxzyABCDEFGHIJKLMNOPQRSTUVWXZY0123456789 ';
......@@ -9,18 +17,20 @@ function randomString() {
return str;
};
var db = new sqlite3.Database('test/support/big.db');
var count = 10000000;
db.serialize(function() {
db.run("CREATE TABLE foo (id INT, txt TEXT)");
db.run("BEGIN TRANSACTION");
var stmt = db.prepare("INSERT INTO foo VALUES(?, ?)");
for (var i = 0; i < count; i++) {
stmt.run(i, randomString());
}
stmt.finalize();
db.run("COMMIT TRANSACTION");
});
if (existsSync(db_path)) {
console.log('okay: database already created (' + db_path + ')');
} else {
console.log("Creating test database... This may take several minutes.");
var db = new sqlite3.Database(db_path);
db.serialize(function() {
db.run("CREATE TABLE foo (id INT, txt TEXT)");
db.run("BEGIN TRANSACTION");
var stmt = db.prepare("INSERT INTO foo VALUES(?, ?)");
for (var i = 0; i < count; i++) {
stmt.run(i, randomString());
}
stmt.finalize();
db.run("COMMIT TRANSACTION");
});
}
var assert = require('assert');
var fs = require('fs');
var pathExists = require('fs').existsSync || require('path').existsSync;
exports.deleteFile = function(name) {
try {
......@@ -11,6 +12,12 @@ exports.deleteFile = function(name) {
}
};
// Synchronously create the directory if it does not already exist.
// (The cb parameter is currently unused.)
exports.ensureExists = function(name, cb) {
    if (!pathExists(name)) {
        fs.mkdirSync(name);
    }
};
assert.fileDoesNotExist = function(name) {
try {
fs.statSync(name);
......