Logistic Regression (Example)


In this section we test the algorithm on a concrete example. Download Text.txt (extraction code: 3b8f) and Train.txt (extraction code: d947).

import numpy as np

def classifyVector(inX, weights):
    # Classify one example: sigmoid of the weighted sum, thresholded at 0.5
    prob = sigmoid(np.sum(inX * weights))
    if prob > 0.5:
        return 1.0
    else:
        return 0.0

def colicTest():
    frTrain = open('horseColicTraining.txt')
    frTest = open('horseColicTest.txt')
    # Parse the training file: 21 features per line, label in the last column
    trainingSet = []
    trainingLabels = []
    for line in frTrain.readlines():
        currLine = line.strip().split('\t')
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        trainingSet.append(lineArr)
        trainingLabels.append(float(currLine[21]))
    # Fit the weights with the improved stochastic gradient ascent from the previous section
    trainWeights = stocGradAscent1(np.array(trainingSet), trainingLabels, 1000)
    # Classify each test example and count the mistakes
    errorCount = 0
    numTestVec = 0.0
    for line in frTest.readlines():
        numTestVec += 1.0
        currLine = line.strip().split('\t')
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        if int(classifyVector(np.array(lineArr), trainWeights)) != int(currLine[21]):
            errorCount += 1
    errorRate = float(errorCount) / numTestVec
    print("the error rate of this test is: %f" % errorRate)
    return errorRate

def multiTest():
    # Run colicTest ten times and report the average error rate
    numTests = 10
    errorSum = 0.0
    for k in range(numTests):
        errorSum += colicTest()
    print("after %d iterations the average error rate is: %f" % (numTests, errorSum / float(numTests)))
The first function, classifyVector, takes the regression weights and a feature vector, computes the sigmoid of their weighted sum, and returns 1.0 if the result exceeds 0.5 and 0.0 otherwise. The second function, colicTest, parses the training and test files, estimates the regression weights with the improved stochastic gradient ascent algorithm, then classifies every test example and reports the error rate. The last function, multiTest, runs colicTest ten times and prints the average error rate. A sketch of the helper functions these rely on follows below.
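The listing above calls sigmoid and stocGradAscent1, which were defined in the earlier sections of this series. For completeness, here is a minimal sketch of those two helpers, assuming the improved stochastic gradient ascent described previously (a decaying step size and random sample selection without replacement within each pass); the details may differ slightly from the earlier code.

import random
import numpy as np

def sigmoid(inX):
    # Logistic function
    return 1.0 / (1 + np.exp(-inX))

def stocGradAscent1(dataMatrix, classLabels, numIter=150):
    # Improved stochastic gradient ascent: the step size alpha decays as the
    # iterations proceed, and each pass visits the samples in random order
    m, n = dataMatrix.shape
    weights = np.ones(n)
    for j in range(numIter):
        dataIndex = list(range(m))
        for i in range(m):
            alpha = 4 / (1.0 + j + i) + 0.01  # decaying step size
            randIndex = int(random.uniform(0, len(dataIndex)))
            h = sigmoid(np.sum(dataMatrix[dataIndex[randIndex]] * weights))
            error = classLabels[dataIndex[randIndex]] - h
            weights = weights + alpha * error * dataMatrix[dataIndex[randIndex]]
            del dataIndex[randIndex]
    return weights

With these in place, calling multiTest() runs ten train/test rounds on horseColicTraining.txt and horseColicTest.txt and prints the average error rate.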
