SQLAlchemy Bulk Insert Performance Comparison
Source: Internet · Editor: 程序博客网 · Date: 2024/06/04 23:20
```python
import time
import sqlite3

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

Base = declarative_base()
DBSession = scoped_session(sessionmaker())
engine = None


class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))


def init_sqlalchemy(dbname='sqlite:///sqlalchemy.db'):
    global engine
    engine = create_engine(dbname, echo=False)
    DBSession.remove()
    DBSession.configure(bind=engine, autoflush=False, expire_on_commit=False)
    Base.metadata.drop_all(engine)
    Base.metadata.create_all(engine)


def test_sqlalchemy_orm(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer()
        customer.name = 'NAME ' + str(i)
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print(
        "SQLAlchemy ORM: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_pk_given(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer(id=i + 1, name="NAME " + str(i))
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print(
        "SQLAlchemy ORM pk given: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_bulk_save_objects(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    n1 = n
    while n1 > 0:
        # Decrement after building the batch, so that exactly n rows
        # are inserted (decrementing first loses the final batch).
        DBSession.bulk_save_objects(
            [
                Customer(name="NAME " + str(i))
                for i in range(min(10000, n1))
            ]
        )
        n1 = n1 - 10000
    DBSession.commit()
    print(
        "SQLAlchemy ORM bulk_save_objects(): Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_bulk_insert(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    n1 = n
    while n1 > 0:
        DBSession.bulk_insert_mappings(
            Customer,
            [
                dict(name="NAME " + str(i))
                for i in range(min(10000, n1))
            ]
        )
        n1 = n1 - 10000
    DBSession.commit()
    print(
        "SQLAlchemy ORM bulk_insert_mappings(): Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_core(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    engine.execute(
        Customer.__table__.insert(),
        [{"name": 'NAME ' + str(i)} for i in range(n)]
    )
    # ==> engine.execute('insert into ttable (name) values ("NAME"), ("NAME2")')
    print(
        "SQLAlchemy Core: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def init_sqlite3(dbname):
    conn = sqlite3.connect(dbname)
    c = conn.cursor()
    c.execute("DROP TABLE IF EXISTS customer")
    c.execute(
        "CREATE TABLE customer (id INTEGER NOT NULL, "
        "name VARCHAR(255), PRIMARY KEY(id))")
    conn.commit()
    return conn


def test_sqlite3(n=100000, dbname='sqlite3.db'):
    conn = init_sqlite3(dbname)
    c = conn.cursor()
    t0 = time.time()
    for i in range(n):
        row = ('NAME ' + str(i),)
        c.execute("INSERT INTO customer (name) VALUES (?)", row)
    conn.commit()
    print(
        "sqlite3: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " sec")


if __name__ == '__main__':
    test_sqlalchemy_orm(100000)
    test_sqlalchemy_orm_pk_given(100000)
    test_sqlalchemy_orm_bulk_save_objects(100000)
    test_sqlalchemy_orm_bulk_insert(100000)
    test_sqlalchemy_core(100000)
    test_sqlite3(100000)
```
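One step the benchmark above does not take: the raw sqlite3 test still calls `execute()` once per row. The DBAPI's `executemany()` hands the whole sequence to the driver in a single call, which is typically faster still. A minimal sketch using an in-memory database (this timing harness is our own addition, not part of the original script):

```python
import sqlite3
import time

# In-memory database so the sketch leaves no file behind.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute(
    "CREATE TABLE customer (id INTEGER NOT NULL, "
    "name VARCHAR(255), PRIMARY KEY(id))")

n = 100000
rows = [("NAME " + str(i),) for i in range(n)]

t0 = time.time()
# One driver-level call instead of n Python-level execute() calls.
c.executemany("INSERT INTO customer (name) VALUES (?)", rows)
conn.commit()
print("sqlite3 executemany: Total time for", n, "records",
      time.time() - t0, "secs")
```

This is roughly what SQLAlchemy Core does under the hood when it is handed a list of parameter dictionaries, which is why the Core timing lands so close to the raw-DBAPI timing.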
The code above exercises, in order: the plain ORM, the ORM with primary keys pre-assigned, the ORM's bulk_save_objects(), the ORM's bulk_insert_mappings(), the non-ORM Core form, and the raw DBAPI. The results for inserting 100,000 records:
SQLAlchemy ORM: Total time for 100000 records 7.2070479393 secs
SQLAlchemy ORM pk given: Total time for 100000 records 4.28471207619 secs
SQLAlchemy ORM bulk_save_objects(): Total time for 100000 records 1.58296084404 secs
SQLAlchemy ORM bulk_insert_mappings(): Total time for 100000 records 0.453973054886 secs
SQLAlchemy Core: Total time for 100000 records 0.210998058319 secs
sqlite3: Total time for 100000 records 0.136252880096 sec

Note:
Only the first two approaches make the inserted records' ids available on the objects; the bulk, Core, and raw-DBAPI approaches do not, so the ids must be fetched afterwards with a separate query.
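The same trade-off exists at the DBAPI level: `lastrowid` is defined after a single-row `execute()`, but not after `executemany()`, so ids for batched rows have to be read back with a query, as the note above describes. A sketch (the table and names here are ours, not from the benchmark):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name VARCHAR(255))")

# Row-by-row: the driver reports each generated id via lastrowid,
# mirroring how the first two ORM approaches expose customer.id.
c.execute("INSERT INTO customer (name) VALUES (?)", ("alice",))
print(c.lastrowid)  # 1

# Batched: lastrowid is not meaningful after executemany(), so the
# generated ids must be fetched back with an explicit query.
c.executemany("INSERT INTO customer (name) VALUES (?)",
              [("bob",), ("carol",)])
ids = [row[0] for row in
       c.execute("SELECT id FROM customer WHERE name IN (?, ?) ORDER BY id",
                 ("bob", "carol"))]
print(ids)  # [2, 3]
```

On the SQLAlchemy side, bulk_save_objects() accepts a return_defaults=True flag that fetches the primary keys back onto the objects, but doing so gives up much of the bulk method's speed advantage.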
Original article: http://docs.sqlalchemy.org/en/latest/faq/performance.html