How do I detect a memory leak in my Python code? I am new to both machine learning and Python. I want my code to predict objects in pictures; in my case, mainly cars.
When I start the script it runs smoothly, but after 20-odd pictures it hangs my system because of a memory leak.
I want the script to run through my whole database, which holds far more than 20 pictures.
I have tried the Pympler tracker to trace which objects take up the most memory.
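(Besides Pympler, Python's built-in tracemalloc module can compare two memory snapshots and report which source lines account for the growth. A minimal sketch, with the leak simulated by a list that keeps references alive forever:)

```python
import tracemalloc

tracemalloc.start()
leaky = []  # simulated leak: references are never released

before = tracemalloc.take_snapshot()
for _ in range(1000):
    leaky.append(bytearray(1024))  # allocates ~1 MB across the loop
after = tracemalloc.take_snapshot()

# Lines with the largest positive size_diff are the leak suspects.
for stat in after.compare_to(before, "lineno")[:3]:
    print(stat)
```

Running this periodically inside the prediction loop (instead of at the end) shows which line's allocations keep growing from one picture to the next.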
Here is the code I am trying to run to predict the objects in the pictures:

from imageai.Prediction import ImagePrediction
import os
import requests
import mysql.connector
from pympler.tracker import SummaryTracker

tracker = SummaryTracker()

mydb = mysql.connector.connect(
    host="localhost",
    user="phpmyadmin",
    passwd="anshu",
    database="python_test"
)

counter = 0
mycursor = mydb.cursor()
sql = "SELECT id, image_url FROM `used_cars` " \
      "WHERE is_processed = '0' AND image_url IS NOT NULL LIMIT 1"
mycursor.execute(sql)  # the query must be executed before fetching
result = mycursor.fetchall()

def dl_img(url, filepath, filename):
    # download the image and save it under filepath + filename
    fullpath = filepath + filename
    response = requests.get(url)
    with open(fullpath, "wb") as f:
        f.write(response.content)

for eachfile in result:
    id = eachfile[0]
    print(id)
    filename = "image.jpg"
    url = eachfile[1]
    filepath = "/home/priyanshu/PycharmProjects/untitled/images/"
    print(filename)
    print(url)
    print(filepath)
    dl_img(url, filepath, filename)

    execution_path = "/home/priyanshu/PycharmProjects/untitled/images/"
    prediction = ImagePrediction()
    prediction.setModelTypeAsResNet()
    prediction.setModelPath(os.path.join(execution_path,
        "/home/priyanshu/Downloads/resnet50_weights_tf_dim_ordering_tf_kernels.h5"))
    prediction.loadModel()
    predictions, probabilities = prediction.predictImage(
        os.path.join(execution_path, "image.jpg"), result_count=1)

    per = 0.00
    label = ""
    for eachPrediction, eachProbability in zip(predictions, probabilities):
        print(eachPrediction, " : ", eachProbability)
        label = eachPrediction
        per = eachProbability

    print("Label: " + label)
    print("Per:" + str(per))
    counter = counter + 1
    print("Picture Number: " + str(counter))
    sql1 = "UPDATE used_cars SET is_processed = '1' WHERE id = '%s'" % id
    sql2 = "INSERT into label (used_car_image_id, object_label, percentage) " \
           "VALUE ('%s', '%s', '%s') " % (id, label, per)
    mycursor.execute(sql1)
    mycursor.execute(sql2)
    print("done")
    mydb.commit()
    tracker.print_diff()
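(One likely source of the growth in the script above is that ImagePrediction() is constructed and loadModel() is called inside the per-picture loop, so a fresh ResNet model is built for every image. The fix is to build the model once before the loop and reuse it. A minimal sketch of the pattern, using a hypothetical FakeModel as a stand-in for the real predictor:)

```python
# FakeModel is a hypothetical stand-in for an expensive object such as
# imageai's ImagePrediction with a loaded ResNet; it counts its instances.
class FakeModel:
    instances = 0

    def __init__(self):
        FakeModel.instances += 1  # track how many models get built

    def predict(self, image):
        return ("car", 97.3)      # dummy result

def process_all(images):
    model = FakeModel()           # built ONCE, outside the loop
    results = []
    for img in images:
        results.append(model.predict(img))
    return results

results = process_all(["a.jpg", "b.jpg", "c.jpg"])
print(FakeModel.instances)  # 1, no matter how many images are processed
```

In the real script this means moving the execution_path / setModelPath / loadModel block above the for loop and keeping only predictImage inside it.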
This is the result I get from a single picture; after a few iterations the script consumes all of my RAM. What can I do to stop the leak?
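(Unrelated to the leak, but worth fixing while restructuring: the sql1/sql2 strings are built with % interpolation, which is open to SQL injection. A minimal sketch of parameterized queries, shown with the standard-library sqlite3 for portability; mysql.connector works the same way with %s placeholders:)

```python
import sqlite3

# In-memory stand-in for the MySQL database used in the script above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE used_cars (id INTEGER, is_processed TEXT)")
cur.execute("INSERT INTO used_cars VALUES (1, '0')")

# Placeholders let the driver escape the values safely.
cur.execute("UPDATE used_cars SET is_processed = '1' WHERE id = ?", (1,))
conn.commit()

cur.execute("SELECT is_processed FROM used_cars WHERE id = ?", (1,))
print(cur.fetchone()[0])  # '1'
```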