Commit 7079fca

smurching authored and thunterdb committed

Update version numbers to 0.2.9-rc1, update python dependencies (databricks#124)

* Update version number to 0.2.9-rc1, update pandas version from 0.19.0 -> 0.19.1
* Update readme to indicate that pandas is now a tensorframes dependency
* Update python dependencies to specify minimum version requirements instead of absolute version requirements

1 parent 2e22218 commit 7079fca

File tree

3 files changed: +10 −8 lines

- README.md
- project/Build.scala
- python/requirements.txt

README.md

Lines changed: 7 additions & 5 deletions

````diff
@@ -40,6 +40,8 @@ TensorFrames is available as a
 [official instructions](https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#download-and-setup)
 on how to get the latest release of TensorFlow.
 
+- (Optional) pandas >= 0.19.1 if you want to use the python interface
+
 - (Optional) the [Nix package manager](http://nixos.org/nix/) if you want to guarantee a fully reproducible build environment. This is the environment that will be used for reproducing bugs.
 
 Additionally, if you want to run unit tests for python, you need the following dependencies:
@@ -52,7 +54,7 @@ Additionally, if you want to run unit tests for python, you need the following d
 Assuming that `SPARK_HOME` is set, you can use PySpark like any other Spark package.
 
 ```bash
-$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc0-s_2.11
+$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc1-s_2.11
 ```
 
 Here is a small program that uses Tensorflow to add 3 to an existing column.
@@ -150,7 +152,7 @@ The scala support is a bit more limited than python. In scala, operations can be
 You simply use the published package:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc0
+$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc1
 ```
 
 Here is the same program as before:
@@ -200,14 +202,14 @@ build/sbt distribution/spDist
 Assuming that SPARK_HOME is set and that you are in the root directory of the project:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar
+$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
 ```
 
 If you want to run the python version:
 
 ```bash
-PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar \
-$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar
+PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar \
+$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
 ```
 
 ## Acknowledgements
````
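The README hunks above show the same rc0 → rc1 bump repeated across several hard-coded jar paths, which is why the version change touches so many lines. A minimal sketch of how those assembly-jar paths derive from the version string (the `assembly_jar` helper is hypothetical, for illustration only, and is not part of the project's build):

```python
def assembly_jar(version: str, scala_binary: str = "2.11") -> str:
    """Build the testing assembly-jar path used in the README's examples."""
    return "target/testing/scala-%s/tensorframes-assembly-%s.jar" % (
        scala_binary,
        version,
    )

print(assembly_jar("0.2.9-rc1"))
# target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
```

Deriving the path in one place like this is what a build tool (here, sbt) does internally; editing the version by hand instead is what forces a multi-line diff like the one above.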

project/Build.scala

Lines changed: 1 addition & 1 deletion

```diff
@@ -11,7 +11,7 @@ object Shading extends Build {
 
 
   lazy val commonSettings = Seq(
-    version := "0.2.9-rc0",
+    version := "0.2.9-rc1",
     name := "tensorframes",
     scalaVersion := sys.props.getOrElse("scala.version", "2.11.8"),
     organization := "databricks",
```

python/requirements.txt

Lines changed: 2 additions & 2 deletions

```diff
@@ -1,3 +1,3 @@
 # This file should list any python package dependencies.
-nose==1.3.3
-pandas==0.19.0
+nose>=1.3.3
+pandas>=0.19.1
```
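This last change switches the pip requirement specifiers from exact pins (`==`) to minimum-version floors (`>=`): an exact pin rejects every version except the one named, while a floor accepts that version and anything newer. A minimal sketch of the difference, using a toy tuple comparison rather than pip's real PEP 440 parsing (the `meets_minimum` helper is hypothetical and only handles plain dotted-numeric versions like `0.19.1`):

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """Toy check for a '>=' floor: compare dotted numeric versions as tuples."""
    def as_tuple(v: str):
        return tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

# Under the old pin pandas==0.19.0, only 0.19.0 itself was acceptable.
# Under the new floor pandas>=0.19.1, 0.19.1 and anything newer pass:
print(meets_minimum("0.19.1", "0.19.1"))  # True
print(meets_minimum("0.20.0", "0.19.1"))  # True
print(meets_minimum("0.19.0", "0.19.1"))  # False
```

In practice the floor lets users keep their existing, newer pandas and nose installs instead of being forced to downgrade to one exact version.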
