Python 2.7 Learning Notes


A lot of sample code online doesn't work with Python 3, so I switched back to version 2 for studying.


install: installing modules is very simple

E:\01_SOFT\Python27\python  -m easy_install sunburnt
E:\01_SOFT\Python27\python  -m easy_install lxml
E:\01_SOFT\Python27\python  -m easy_install requests
Microsoft Visual C++ Compiler for Python 2.7 
http://www.microsoft.com/en-us/download/confirmation.aspx?id=44266


beautifulsoup4
cd E:\01_SOFT\beautifulsoup4-4.3.2\beautifulsoup4-4.3.2
python setup.py install


MongoDB installation
http://blog.csdn.net/t_ells/article/details/50265889
E:\01_SOFT\mongodb\mongodb-win32-x86_64-2.2.0\bin\mongod --dbpath "E:\01_SOFT\mongodb\data"


pymongo
https://pypi.python.org/pypi/pymongo/
Install: E:\01_SOFT\Python27\python  -m easy_install pymongo


download
https://www.python.org/downloads/release/python-352/


A simple web crawler in Python
http://www.cnblogs.com/fnng/p/3576154.html


Detailed example: simulating a Baidu login in Python
http://www.jb51.net/article/78406.htm


Timestamp conversion tool
http://tool.lu/timestamp


Basic ways to parse JSON data with Python
http://www.jb51.net/article/73450.htm


Unicode encoding converter
http://tool.chinaz.com/tools/unicode.aspx


Online JSON viewer
http://www.bejson.com/jsonviewernew/


Software downloads
https://pan.baidu.com/share/home?uk=2466540631#category/type=0


Python crawler tutorial, part 6: using cookies
http://cuiqingcai.com/968.html


Cookies and CookieJar 
https://bytes.com/topic/python/answers/802534-cookies-cookiejar


Python practice plan, study assignment 2-1
http://blog.csdn.net/python012/article/details/53344501


1. Fix for the missing api-ms-win-crt-runtime-l1-1-0.dll
https://www.microsoft.com/zh-cn/download/confirmation.aspx?id=48145


2. can't use a string pattern on a bytes-like object
The downloaded page is bytes; decode it first (with the page's actual charset, GBK here) so a str pattern can match:
imglist = re.findall(imgre, html.decode('GBK'))
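A minimal runnable sketch of that fix (the HTML string and the pattern here are made up for illustration):

```python
import re

# Downloaded pages arrive as bytes; a str regex pattern cannot be
# applied to them directly. Decode first, using the page's charset.
html_bytes = u'<img src="a.jpg"><img src="b.jpg">'.encode('GBK')

imgre = re.compile(r'src="(.+?\.jpg)"')
imglist = imgre.findall(html_bytes.decode('GBK'))
print(imglist)
```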


3. inconsistent use of tabs and spaces in indentation
Replace the tabs with spaces.


4. UnicodeDecodeError: 'gbk' codec can't decode byte 0xaf in position 197: illegal multibyte sequence
The page was not actually GBK-encoded; decode with its real charset instead:
html.decode('utf-8')
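A small sketch of the same idea (the example bytes are mine): the matching codec round-trips cleanly, the wrong one raises exactly this kind of error, and errors='ignore' is a last resort when the charset is unknown.

```python
raw = u'中文'.encode('utf-8')      # bytes as fetched from a UTF-8 page

text = raw.decode('utf-8')         # the matching codec round-trips cleanly

# A stray lead byte shows how the wrong codec blows up, as in the error above:
try:
    b'\xaf'.decode('gbk')
except UnicodeDecodeError:
    print("gbk cannot decode byte 0xaf by itself")

# Last resort: drop undecodable bytes instead of crashing.
salvaged = b'\xaf'.decode('gbk', errors='ignore')
```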


5. Missing parentheses in call to 'print'
Change print x
to print(x)


6. IndentationError: expected an indented block
The amount of leading whitespace before a statement determines its block level.
Python is very sensitive to indentation: the line after a colon almost always has to be indented one level.
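A tiny sketch of the rule: every colon opens a new block, and the lines inside it must be indented one level (consistently, with spaces):

```python
def greet(name):
    # this body line must be indented relative to the 'def' line
    if name:
        return 'hello, ' + name   # one level deeper after another colon
    return 'hello, stranger'

print(greet('python'))
```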


7. Python "EOL while scanning string literal"
A string's quotes are not paired; a closing quote is missing.


8. BeautifulSoup (bs4)
https://pypi.python.org/pypi/beautifulsoup4/4.3.2




soup = BeautifulSoup(open('index.html'))
print soup.prettify()


import re
list = soup.findAll('a')
list = soup.findAll(name='a', href=re.compile(r"kw="))
list = soup.findAll(name='a', attrs={'href': re.compile(r"kw="), 'title': re.compile(r".")})
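When BeautifulSoup isn't installed, the same filtered link extraction can be sketched with only the standard library's HTML parser (Python 3 module path shown; in Python 2 the module is named HTMLParser). The class name and sample HTML below are mine:

```python
import re
from html.parser import HTMLParser  # 'from HTMLParser import HTMLParser' on Python 2

class LinkCollector(HTMLParser):
    """Collect href values of <a> tags whose href matches a regex,
    mimicking soup.findAll(name='a', href=re.compile(r"kw="))."""
    def __init__(self, pattern):
        HTMLParser.__init__(self)
        self.pattern = re.compile(pattern)
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            href = dict(attrs).get('href', '')
            if self.pattern.search(href):
                self.links.append(href)

parser = LinkCollector(r'kw=')
parser.feed('<a href="/f?kw=python">tieba</a><a href="/about">about</a>')
print(parser.links)
```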


9. Saving a web page (did not work for Baidu)
import urllib
def cbk(a, b, c):
    '''Progress callback
    @a: number of blocks already downloaded
    @b: size of each block
    @c: total size of the remote file
    '''
    per = 100.0 * a * b / c
    if per > 100:
        per = 100
    print '%.2f%%' % per
urllib.urlretrieve('http://www.cmfish.com/bbs/forum.php', 'D:\\06_Download\\py\\baidu1.html', cbk)
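The callback arithmetic can be pulled out and checked on its own (the function name is mine; in Python 3 the equivalent download call is urllib.request.urlretrieve):

```python
def progress_percent(block_count, block_size, total_size):
    # urlretrieve calls its reporthook with (blocks downloaded so far,
    # block size in bytes, total remote size); the final block can push
    # the product past the total, so clamp at 100.
    return min(100.0 * block_count * block_size / total_size, 100.0)

print('%.2f%%' % progress_percent(5, 1024, 10240))
```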


10. Saving a string
def save(filename, contents): 
  fh = open(filename, 'w') 
  fh.write(contents) 
  fh.close() 
save('D:\\06_Download\\py\\baidu.html', content) 
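A variant of save() that pins down the encoding (io.open behaves the same on Python 2 and 3), which sidesteps the codec errors in notes 11 and 13 below; the function name is mine:

```python
import io

def save_text(filename, contents, encoding='utf-8'):
    # io.open writes unicode text through an explicit codec instead of
    # the platform default, and the with-block closes the file for us.
    with io.open(filename, 'w', encoding=encoding) as fh:
        fh.write(contents)
```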


11. UnicodeEncodeError: 'gbk' codec can't encode character u'\xa9' in position 24051: illegal multibyte sequence
source_code.encode('GB18030')
Did not work!
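For what it's worth, GB18030 covers all of Unicode, so encoding U+00A9 (the copyright sign from this error) with it does succeed; a quick check, plus an errors= fallback that keeps gbk:

```python
sign = u'\xa9'                         # the copyright sign from the error

# gbk has no mapping for it, which is exactly the reported error:
try:
    sign.encode('gbk')
except UnicodeEncodeError:
    print('gbk cannot encode U+00A9')

# gb18030 covers all of Unicode, so this round-trips:
encoded = sign.encode('gb18030')
print(encoded.decode('gb18030') == sign)

# Or keep gbk and silently drop unencodable characters:
stripped = sign.encode('gbk', errors='ignore')
```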


12. TypeError: coercing to Unicode: need string or buffer, type found
A type object was passed where a string was expected, typically by calling open() or file() with something that is not a filename string (e.g. open(file) instead of open(filename)).


13. 'ascii' codec can't encode characters in position 4-7: ordinal not in range(128)
Calling str() on a unicode string (e.g. str(u'中文')) uses the ASCII codec in Python 2; use .encode('utf-8') instead.


14. invalid mode ('w') or filename
The file path is wrong: the backslash needs escaping.
f = file("E:\json.txt", 'w')
should be
f = file("E:\\json.txt", 'w')
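The underlying trap is string escape sequences: "\j" happens to pass through unchanged, but "\t" or "\n" in a path silently becomes a tab or a newline. A quick demonstration; raw strings are the safer habit for Windows paths:

```python
# "\n" inside a normal string literal is a newline, so this "path" is broken:
bad = "E:\new.txt"
print("\n" in bad)

# Escape the backslash, or use a raw string; these two are identical:
good = "E:\\new.txt"
also_good = r"E:\new.txt"
print(good == also_good)
```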


15. No JSON object could be decoded
expected string or buffer
python: invalid file mode or filename
> Save the file as UTF-8 without a BOM.


import json
from pymongo import MongoClient

client = MongoClient('127.0.0.1', 27017)
db = client["Collections"]  # database name
table = db['user']          # collection (table) name
table.insert({'id': '1', 'name': 'cnki'})
f = file("D:\\json.txt")
j = json.loads(f.read())
f.close()
table = db['jsontxt']       # collection (table) name
courseId = table.save(j)


Parsing a JSON file in Python fails with ValueError: No JSON object could be decoded, because Python's json library does not support UTF-8 with a BOM
http://www.crifan.com/fixed_problem_for_python_valueerror_no_json_object_could_be_decoded/
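A stdlib-only sketch of the fix for when you cannot control how the file was saved (the function name is mine): strip the UTF-8 BOM before handing the text to json.loads.

```python
import json
import codecs

def load_json(path):
    # json refuses input that starts with a UTF-8 BOM (the source of the
    # 'No JSON object could be decoded' in note 15), so read the raw
    # bytes, strip the BOM if present, then decode and parse.
    with open(path, 'rb') as f:
        raw = f.read()
    if raw.startswith(codecs.BOM_UTF8):
        raw = raw[len(codecs.BOM_UTF8):]
    return json.loads(raw.decode('utf-8'))
```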


16. Non-ASCII character '\xe5' in file ……

Cause: Python 2 defaults to ASCII as the source encoding. If your Python source contains Chinese (or text in any other non-ASCII script), saving the file as UTF-8 is not enough by itself.

Add this at the top of the file:

#coding=utf-8


17. unexpected indent
Align consistently with spaces or with tabs; do not mix the two.