[Deep Learning] Implementing U-Net on TensorFlow
TensorFlow Unet installation
Make sure TensorFlow is installed; if not, refer to the TensorFlow installation instructions: link
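For reference, one way to install a 1.x release that is new enough for the version note further below (my own suggestion, not part of the original installation guide):
$ pip install "tensorflow>=1.5,<2.0"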
Install the package
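If the repository has not been cloned yet, fetch it first (assuming the upstream repository is jakeret/tf_unet):
$ git clone https://github.com/jakeret/tf_unet.git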
$ cd tf_unet
$ pip install -r requirements.txt
$ python setup.py install --user
Using the package
An example of using TensorFlow Unet in another project:
from tf_unet import unet, util, image_util
#preparing data loading
data_provider = image_util.ImageDataProvider("fishes/train/*.tif")
#setup & training
net = unet.Unet(layers=3, features_root=64, channels=1, n_class=2)
trainer = unet.Trainer(net)
path = trainer.train(data_provider, output_path, training_iters=32, epochs=100)
#verification
...
prediction = net.predict(path, data)
unet.error_rate(prediction, util.crop_to_shape(label, prediction.shape))
img = util.combine_img_prediction(data, label, prediction)
util.save_image(img, "prediction.jpg")
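A note on the data loading above: a minimal sketch, assuming ImageDataProvider's default suffix convention (the data_suffix and mask_suffix keyword arguments), under which an image fishes/train/foo.tif is paired with the label image fishes/train/foo_mask.tif. Check the image_util API linked below if your masks follow a different naming scheme.
from tf_unet import image_util
# Equivalent to the defaults: image "foo.tif" is paired with label "foo_mask.tif"
data_provider = image_util.ImageDataProvider("fishes/train/*.tif",
                                             data_suffix=".tif",
                                             mask_suffix="_mask.tif")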
Training progress can be tracked with TensorBoard; tf_unet writes out the relevant metrics.
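For example (assuming the summaries end up in the directory passed as output_path to trainer.train, e.g. ./unet_trained in the toy example below):
$ tensorboard --logdir ./unet_trained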
Modules of the tf_unet package (the links contain the API docs and source code, so they are not repeated here)
unet module: link
image_util module: link
util module: link
layers module: link
Example programs shipped with the project (all tested by me personally)
They are all Jupyter notebooks, which makes them easy to learn from.
Toy problem: link
TensorFlow version selection
TensorFlow 1.5.0 or above is recommended. While testing this code I hit the error AttributeError: 'module' object has no attribute 'softmax_cross_entropy_with_logits_v2'; after looking into it, this function was only added in TensorFlow 1.5, so older versions do not have it. If installing 1.5.0 or above is not an option, you can run git checkout 0.1.0 to switch tf_unet to version 0.1.0.
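A quick way to check whether the running TensorFlow build provides this op (a minimal sketch; it just inspects the installed version and the tf.nn namespace):
import tensorflow as tf
print(tf.__version__)
# The _v2 variant only exists in newer releases; on older ones this prints False
print(hasattr(tf.nn, "softmax_cross_entropy_with_logits_v2"))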
from __future__ import division, print_function
%matplotlib inline
import matplotlib.pyplot as plt
import matplotlib
import numpy as np
from tf_unet import image_gen
from tf_unet import unet
from tf_unet import util
nx = 572
ny = 572
generator = image_gen.GrayScaleDataProvider(nx, ny, cnt=20)
x_test, y_test = generator(1)
fig, ax = plt.subplots(1,2, sharey=True, figsize=(8,4))
ax[0].imshow(x_test[0,...,0], aspect="auto")
ax[1].imshow(y_test[0,...,1], aspect="auto")
net = unet.Unet(channels=generator.channels, n_class=generator.n_class, layers=3, features_root=16)
2018-10-12 11:39:22,485 Layers 3, features 16, filter size 3x3, pool size: 2x2
trainer = unet.Trainer(net, optimizer="momentum", opt_kwargs=dict(momentum=0.2))
path = trainer.train(generator, "./unet_trained", training_iters=20, epochs=10, display_step=2)
2018-10-12 11:39:29,649 Removing '/home/songruoning/tf_unet/prediction'
2018-10-12 11:39:29,660 Removing '/home/songruoning/tf_unet/unet_trained'
2018-10-12 11:39:29,707 Allocating '/home/songruoning/tf_unet/prediction'
2018-10-12 11:39:29,712 Allocating '/home/songruoning/tf_unet/unet_trained'
2018-10-12 11:39:36,322 Verification error= 84.5%, loss= 0.7055
2018-10-12 11:39:37,783 Start optimization
2018-10-12 11:39:39,371 Iter 0, Minibatch Loss= 0.6087, Training Accuracy= 0.8474, Minibatch error= 15.3%
2018-10-12 11:39:40,313 Iter 2, Minibatch Loss= 0.5519, Training Accuracy= 0.8113, Minibatch error= 18.9%
2018-10-12 11:39:41,311 Iter 4, Minibatch Loss= 0.5442, Training Accuracy= 0.7797, Minibatch error= 22.0%
2018-10-12 11:39:42,400 Iter 6, Minibatch Loss= 0.4557, Training Accuracy= 0.8398, Minibatch error= 16.0%
2018-10-12 11:39:43,265 Iter 8, Minibatch Loss= 0.4258, Training Accuracy= 0.8523, Minibatch error= 14.8%
2018-10-12 11:39:44,239 Iter 10, Minibatch Loss= 0.4505, Training Accuracy= 0.8334, Minibatch error= 16.7%
......
2018-10-12 11:41:30,394 Iter 178, Minibatch Loss= 0.2114, Training Accuracy= 0.9058, Minibatch error= 9.4%
2018-10-12 11:41:30,749 Epoch 8, Average loss: 0.2805, learning rate: 0.1327
2018-10-12 11:41:30,940 Verification error= 7.3%, loss= 0.2117
2018-10-12 11:41:33,629 Iter 180, Minibatch Loss= 0.1934, Training Accuracy= 0.9368, Minibatch error= 6.3%
2018-10-12 11:41:34,705 Iter 182, Minibatch Loss= 0.1574, Training Accuracy= 0.9487, Minibatch error= 5.1%
2018-10-12 11:41:35,795 Iter 184, Minibatch Loss= 0.1452, Training Accuracy= 0.9536, Minibatch error= 4.6%
2018-10-12 11:41:36,947 Iter 186, Minibatch Loss= 0.1582, Training Accuracy= 0.9421, Minibatch error= 5.8%
2018-10-12 11:41:37,928 Iter 188, Minibatch Loss= 0.1466, Training Accuracy= 0.9388, Minibatch error= 6.1%
2018-10-12 11:41:39,022 Iter 190, Minibatch Loss= 0.1780, Training Accuracy= 0.9360, Minibatch error= 6.4%
2018-10-12 11:41:40,164 Iter 192, Minibatch Loss= 0.1010, Training Accuracy= 0.9702, Minibatch error= 3.0%
2018-10-12 11:41:41,344 Iter 194, Minibatch Loss= 0.2422, Training Accuracy= 0.9180, Minibatch error= 8.2%
2018-10-12 11:41:42,465 Iter 196, Minibatch Loss= 0.1394, Training Accuracy= 0.9664, Minibatch error= 3.4%
2018-10-12 11:41:43,670 Iter 198, Minibatch Loss= 0.1302, Training Accuracy= 0.9575, Minibatch error= 4.2%
2018-10-12 11:41:44,016 Epoch 9, Average loss: 0.1315, learning rate: 0.1260
2018-10-12 11:41:44,713 Verification error= 3.9%, loss= 0.1573
2018-10-12 11:41:46,402 Optimization Finished!
x_test, y_test = generator(1)
prediction = net.predict("./unet_trained/model.cpkt", x_test)
INFO:tensorflow:Restoring parameters from ./unet_trained/model.cpkt
2018-10-12 11:42:14,147 Restoring parameters from ./unet_trained/model.cpkt
2018-10-12 11:42:15,091 Model restored from file: ./unet_trained/model.cpkt
fig, ax = plt.subplots(1, 3, sharex=True, sharey=True, figsize=(12,5))
ax[0].imshow(x_test[0,...,0], aspect="auto")
ax[1].imshow(y_test[0,...,1], aspect="auto")
mask = prediction[0,...,1] > 0.9
ax[2].imshow(mask, aspect="auto")
ax[0].set_title("Input")
ax[1].set_title("Ground truth")
ax[2].set_title("Prediction")
fig.tight_layout()
fig.savefig("docs/toy_problem.png")
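Because the network output is spatially smaller than the 572x572 input (valid convolutions), a rough pixel accuracy can be computed against the centre-cropped ground truth. This is my own addition, not part of the notebook; it is a minimal sketch reusing the util.crop_to_shape helper from the usage example above and the same 0.9 threshold as the mask:
# Crop the ground truth to the prediction's spatial shape before comparing
y_crop = util.crop_to_shape(y_test, prediction.shape)
pixel_acc = np.mean((prediction[0, ..., 1] > 0.9) == (y_crop[0, ..., 1] > 0.5))
print("pixel accuracy: %.3f" % pixel_acc)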
Monitoring of radio frequency interference: link