SQLAlchemy Bulk Insert Performance Comparison
Published: 程序博客网, 2024/06/05 02:24
import time
import sqlite3

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

Base = declarative_base()
DBSession = scoped_session(sessionmaker())
engine = None


class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))


def init_sqlalchemy(dbname='sqlite:///sqlalchemy.db'):
    global engine
    engine = create_engine(dbname, echo=False)
    DBSession.remove()
    DBSession.configure(bind=engine, autoflush=False, expire_on_commit=False)
    Base.metadata.drop_all(engine)
    Base.metadata.create_all(engine)


def test_sqlalchemy_orm(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer()
        customer.name = 'NAME ' + str(i)
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print(
        "SQLAlchemy ORM: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_pk_given(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer(id=i + 1, name="NAME " + str(i))
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print(
        "SQLAlchemy ORM pk given: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_bulk_save_objects(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    n1 = n
    while n1 > 0:
        DBSession.bulk_save_objects(
            [
                Customer(name="NAME " + str(i))
                for i in range(min(10000, n1))
            ]
        )
        n1 = n1 - 10000
    DBSession.commit()
    print(
        "SQLAlchemy ORM bulk_save_objects(): Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_bulk_insert(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    n1 = n
    while n1 > 0:
        DBSession.bulk_insert_mappings(
            Customer,
            [
                dict(name="NAME " + str(i))
                for i in range(min(10000, n1))
            ]
        )
        n1 = n1 - 10000
    DBSession.commit()
    print(
        "SQLAlchemy ORM bulk_insert_mappings(): Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_core(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    engine.execute(
        Customer.__table__.insert(),
        [{"name": 'NAME ' + str(i)} for i in range(n)]
    )
    # ==> engine.execute('insert into ttable (name) values ("NAME"), ("NAME2")')
    print(
        "SQLAlchemy Core: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " secs")


def init_sqlite3(dbname):
    conn = sqlite3.connect(dbname)
    c = conn.cursor()
    c.execute("DROP TABLE IF EXISTS customer")
    c.execute(
        "CREATE TABLE customer (id INTEGER NOT NULL, "
        "name VARCHAR(255), PRIMARY KEY(id))")
    conn.commit()
    return conn


def test_sqlite3(n=100000, dbname='sqlite3.db'):
    conn = init_sqlite3(dbname)
    c = conn.cursor()
    t0 = time.time()
    for i in range(n):
        row = ('NAME ' + str(i),)
        c.execute("INSERT INTO customer (name) VALUES (?)", row)
    conn.commit()
    print(
        "sqlite3: Total time for " + str(n) +
        " records " + str(time.time() - t0) + " sec")


if __name__ == '__main__':
    test_sqlalchemy_orm(100000)
    test_sqlalchemy_orm_pk_given(100000)
    test_sqlalchemy_orm_bulk_save_objects(100000)
    test_sqlalchemy_orm_bulk_insert(100000)
    test_sqlalchemy_core(100000)
    test_sqlite3(100000)
The code above exercises, in turn: plain ORM inserts; ORM inserts with the primary key supplied; the ORM's bulk_save_objects(); the ORM's bulk_insert_mappings(); the non-ORM (Core) form; and the raw DB-API. Inserting 100,000 records gives the following results:
SQLAlchemy ORM: Total time for 100000 records 7.2070479393 secs
SQLAlchemy ORM pk given: Total time for 100000 records 4.28471207619 secs
SQLAlchemy ORM bulk_save_objects(): Total time for 100000 records 1.58296084404 secs
SQLAlchemy ORM bulk_insert_mappings(): Total time for 100000 records 0.453973054886 secs
SQLAlchemy Core: Total time for 100000 records 0.210998058319 secs
sqlite3: Total time for 100000 records 0.136252880096 sec
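As a side note not covered by the original benchmark: SQLAlchemy 1.4 added an ORM-level executemany path, session.execute(insert(...), list_of_dicts), which performs close to the Core form while still going through the Session. A minimal sketch, assuming SQLAlchemy 1.4+ and an in-memory SQLite database:

```python
# Sketch (not from the original article): ORM executemany-style insert,
# available in SQLAlchemy 1.4+ via session.execute(insert(...), [dicts]).
from sqlalchemy import Column, Integer, String, create_engine, insert, select, func
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()


class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))


engine = create_engine("sqlite://")  # in-memory database for the sketch
Base.metadata.create_all(engine)

with Session(engine) as session:
    # One executemany-style statement for all rows, like the Core test above.
    session.execute(
        insert(Customer),
        [{"name": "NAME " + str(i)} for i in range(100000)],
    )
    session.commit()
    n_rows = session.scalar(select(func.count()).select_from(Customer))
```

This keeps ORM-level event hooks and unit-of-work semantics out of the hot path while avoiding a drop down to raw Core.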
Note: only the first two approaches (plain ORM, and ORM with the primary key given) make the inserted records' ids available directly; with the others you cannot get them back without querying again yourself.
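If you want a bulk path but still need the generated ids, bulk_save_objects() accepts return_defaults=True, which fetches server-generated values (such as the autoincrement id) back into each object, at some cost in speed. A minimal sketch, assuming SQLAlchemy 1.4+ and an in-memory SQLite database:

```python
# Sketch (not from the original article): recovering primary keys
# from a bulk save via return_defaults=True.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()


class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))


engine = create_engine("sqlite://")  # in-memory database for the sketch
Base.metadata.create_all(engine)

customers = [Customer(name="NAME " + str(i)) for i in range(3)]
with Session(engine) as session:
    # return_defaults=True populates each object's id after the INSERT,
    # which may force row-at-a-time inserts and reduce the bulk speedup.
    session.bulk_save_objects(customers, return_defaults=True)
    session.commit()
ids = [c.id for c in customers]
```

In other words, return_defaults trades back some of the bulk performance for the per-row ids the note above says are otherwise unavailable.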
Original article: http://docs.sqlalchemy.org/en/latest/faq/performance.html
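One further variant the benchmark leaves out: the raw sqlite3 test issues one execute() per row, whereas cursor.executemany() pushes the loop into the driver and is usually faster still. A sketch using only the standard library:

```python
# Sketch (not from the original article): batching the raw DB-API
# inserts with sqlite3's Cursor.executemany().
import sqlite3
import time

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute(
    "CREATE TABLE customer (id INTEGER NOT NULL, "
    "name VARCHAR(255), PRIMARY KEY(id))")

t0 = time.time()
# One executemany() call instead of 100000 execute() calls.
c.executemany(
    "INSERT INTO customer (name) VALUES (?)",
    (("NAME " + str(i),) for i in range(100000)),
)
conn.commit()
print(
    "sqlite3 executemany: Total time for 100000 records "
    + str(time.time() - t0) + " secs")
count = c.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
```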