Commit 7e0cd1d

[SPARK-18073][DOCS][WIP] Migrate wiki to spark.apache.org web site
## What changes were proposed in this pull request?

Updates links to the wiki to links to the new location of content on spark.apache.org.

## How was this patch tested?

Doc builds

Author: Sean Owen <[email protected]>

Closes apache#15967 from srowen/SPARK-18073.1.
1 parent 2559fb4 commit 7e0cd1d

File tree

13 files changed (+23, -23 lines)


.github/PULL_REQUEST_TEMPLATE (+1, -1)

@@ -7,4 +7,4 @@
 (Please explain how this patch was tested. E.g. unit tests, integration tests, manual tests)
 (If this patch involves UI changes, please attach a screenshot; otherwise, remove this)

-Please review https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark before opening a pull request.
+Please review http://spark.apache.org/contributing.html before opening a pull request.

CONTRIBUTING.md (+2, -2)

@@ -1,12 +1,12 @@
 ## Contributing to Spark

 *Before opening a pull request*, review the
-[Contributing to Spark wiki](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark).
+[Contributing to Spark guide](http://spark.apache.org/contributing.html).
 It lists steps that are required before creating a PR. In particular, consider:

 - Is the change important and ready enough to ask the community to spend time reviewing?
 - Have you searched for existing, related JIRAs and pull requests?
-- Is this a new feature that can stand alone as a [third party project](https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects) ?
+- Is this a new feature that can stand alone as a [third party project](http://spark.apache.org/third-party-projects.html) ?
 - Is the change being proposed clearly explained and motivated?

 When you contribute code, you affirm that the contribution is your original work and that you

R/README.md (+1, -1)

@@ -51,7 +51,7 @@ sparkR.session()

 #### Making changes to SparkR

-The [instructions](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark) for making contributions to Spark also apply to SparkR.
+The [instructions](http://spark.apache.org/contributing.html) for making contributions to Spark also apply to SparkR.
 If you only make R file changes (i.e. no Scala changes) then you can just re-install the R package using `R/install-dev.sh` and test your changes.
 Once you have made your changes, please include unit tests for them and run existing unit tests using the `R/run-tests.sh` script as described below.

R/pkg/DESCRIPTION (+1, -1)

@@ -11,7 +11,7 @@ Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),
                     email = "[email protected]"),
              person(family = "The Apache Software Foundation", role = c("aut", "cph")))
 URL: http://www.apache.org/ http://spark.apache.org/
-BugReports: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-ContributingBugReports
+BugReports: http://spark.apache.org/contributing.html
 Depends:
     R (>= 3.0),
     methods

README.md (+6, -5)

@@ -29,8 +29,9 @@ To build Spark and its example programs, run:
 You can build Spark using more than one thread by using the -T option with Maven, see ["Parallel builds in Maven 3"](https://cwiki.apache.org/confluence/display/MAVEN/Parallel+builds+in+Maven+3).
 More detailed documentation is available from the project site, at
 ["Building Spark"](http://spark.apache.org/docs/latest/building-spark.html).
-For developing Spark using an IDE, see [Eclipse](https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-Eclipse)
-and [IntelliJ](https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ).
+
+For general development tips, including info on developing Spark using an IDE, see
+[the Useful Developer Tools page](http://spark.apache.org/developer-tools.html).

 ## Interactive Scala Shell

@@ -80,7 +81,7 @@ can be run using:
     ./dev/run-tests

 Please see the guidance on how to
-[run tests for a module, or individual tests](https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools).
+[run tests for a module, or individual tests](http://spark.apache.org/developer-tools.html#individual-tests).

 ## A Note About Hadoop Versions

@@ -100,5 +101,5 @@ in the online documentation for an overview on how to configure Spark.

 ## Contributing

-Please review the [Contribution to Spark](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark)
-wiki for information on how to get started contributing to the project.
+Please review the [Contribution to Spark guide](http://spark.apache.org/contributing.html)
+for information on how to get started contributing to the project.

dev/checkstyle.xml (+1, -1)

@@ -28,7 +28,7 @@

 with Spark-specific changes from:

-https://cwiki.apache.org/confluence/display/SPARK/Spark+Code+Style+Guide
+http://spark.apache.org/contributing.html#code-style-guide

 Checkstyle is very configurable. Be sure to read the documentation at
 http://checkstyle.sf.net (or in your downloaded distribution).

docs/_layouts/global.html (+2, -2)

@@ -113,8 +113,8 @@
 <li><a href="hardware-provisioning.html">Hardware Provisioning</a></li>
 <li class="divider"></li>
 <li><a href="building-spark.html">Building Spark</a></li>
-<li><a href="https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark">Contributing to Spark</a></li>
-<li><a href="https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects">Third Party Projects</a></li>
+<li><a href="http://spark.apache.org/contributing.html">Contributing to Spark</a></li>
+<li><a href="http://spark.apache.org/third-party-projects.html">Third Party Projects</a></li>
 </ul>
 </li>
 </ul>

docs/building-spark.md (+2, -2)

@@ -197,7 +197,7 @@ can be set to control the SBT build. For example:
 To avoid the overhead of launching sbt each time you need to re-compile, you can launch sbt
 in interactive mode by running `build/sbt`, and then run all build commands at the command
 prompt. For more recommendations on reducing build time, refer to the
-[wiki page](https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-ReducingBuildTimes).
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html).

 ## Encrypted Filesystems

@@ -215,7 +215,7 @@ to the `sharedSettings` val. See also [this PR](https://github.com/apache/spark/
 ## IntelliJ IDEA or Eclipse

 For help in setting up IntelliJ IDEA or Eclipse for Spark development, and troubleshooting, refer to the
-[wiki page for IDE setup](https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup).
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html).


 # Running Tests

docs/contributing-to-spark.md (+1, -1)

@@ -5,4 +5,4 @@ title: Contributing to Spark

 The Spark team welcomes all forms of contributions, including bug reports, documentation or patches.
 For the newest information on how to contribute to the project, please read the
-[wiki page on contributing to Spark](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark).
+[Contributing to Spark guide](http://spark.apache.org/contributing.html).

docs/index.md (+2, -2)

@@ -125,8 +125,8 @@ options for deployment:
 * Integration with other storage systems:
   * [OpenStack Swift](storage-openstack-swift.html)
 * [Building Spark](building-spark.html): build Spark using the Maven system
-* [Contributing to Spark](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark)
-* [Third Party Projects](https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects): related third party Spark projects
+* [Contributing to Spark](http://spark.apache.org/contributing.html)
+* [Third Party Projects](http://spark.apache.org/third-party-projects.html): related third party Spark projects

 **External Resources:**

docs/sparkr.md (+1, -1)

@@ -126,7 +126,7 @@ head(df)
 SparkR supports operating on a variety of data sources through the `SparkDataFrame` interface. This section describes the general methods for loading and saving data using Data Sources. You can check the Spark SQL programming guide for more [specific options](sql-programming-guide.html#manually-specifying-options) that are available for the built-in data sources.

 The general method for creating SparkDataFrames from data sources is `read.df`. This method takes in the path for the file to load and the type of data source, and the currently active SparkSession will be used automatically.
-SparkR supports reading JSON, CSV and Parquet files natively, and through packages available from sources like [Third Party Projects](https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects), you can find data source connectors for popular file formats like Avro. These packages can either be added by
+SparkR supports reading JSON, CSV and Parquet files natively, and through packages available from sources like [Third Party Projects](http://spark.apache.org/third-party-projects.html), you can find data source connectors for popular file formats like Avro. These packages can either be added by
 specifying `--packages` with `spark-submit` or `sparkR` commands, or if initializing SparkSession with `sparkPackages` parameter when in an interactive R shell or from RStudio.

 <div data-lang="r" markdown="1">

docs/streaming-programming-guide.md (+1, -1)

@@ -2382,7 +2382,7 @@ additional effort may be necessary to achieve exactly-once semantics. There are
     - [Kafka Integration Guide](streaming-kafka-integration.html)
     - [Kinesis Integration Guide](streaming-kinesis-integration.html)
     - [Custom Receiver Guide](streaming-custom-receivers.html)
-* Third-party DStream data sources can be found in [Third Party Projects](https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects)
+* Third-party DStream data sources can be found in [Third Party Projects](http://spark.apache.org/third-party-projects.html)
 * API documentation
     - Scala docs
         * [StreamingContext](api/scala/index.html#org.apache.spark.streaming.StreamingContext) and

sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala (+2, -3)

@@ -505,12 +505,11 @@ object DataSource {
         provider1 == "com.databricks.spark.avro") {
       throw new AnalysisException(
         s"Failed to find data source: ${provider1.toLowerCase}. Please find an Avro " +
-          "package at " +
-          "https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects")
+          "package at http://spark.apache.org/third-party-projects.html")
     } else {
       throw new ClassNotFoundException(
         s"Failed to find data source: $provider1. Please find packages at " +
-          "https://cwiki.apache.org/confluence/display/SPARK/Third+Party+Projects",
+          "http://spark.apache.org/third-party-projects.html",
         error)
     }
   }
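The Scala hunk above changes only the URL embedded in two error messages; the surrounding lookup-and-fail pattern is what produces them. As a rough illustration (a hypothetical Python sketch with invented names, not Spark's actual Scala API, and with `ValueError`/`LookupError` standing in for `AnalysisException`/`ClassNotFoundException`):

```python
# The new docs URL that this commit swaps into both error messages.
NEW_URL = "http://spark.apache.org/third-party-projects.html"

def lookup_data_source(provider: str) -> str:
    """Resolve a short data source name, pointing users at the new docs URL on failure."""
    name = provider.lower()
    builtin = {"json", "csv", "parquet"}  # hypothetical natively supported formats
    if name in builtin:
        return name
    # Avro gets a dedicated hint, mirroring the special case in the diff above.
    if name in ("avro", "com.databricks.spark.avro"):
        raise ValueError(
            f"Failed to find data source: {name}. "
            f"Please find an Avro package at {NEW_URL}")
    # Everything else: generic "find packages" pointer.
    raise LookupError(
        f"Failed to find data source: {provider}. Please find packages at {NEW_URL}")
```

Since the URL appears only in message strings, the commit changes no behavior, which is why the patch needed only a doc build as its test.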
