To use custom S3 endpoints with the latest Spark distribution, you need to add the external hadoop-aws package. Custom endpoints can then be configured as described in the docs.
bin/spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.2
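For example, the S3A endpoint and credentials can be passed through to the Hadoop configuration with spark.hadoop.* properties (a minimal sketch; the endpoint URL and credential values are placeholders for your S3-compatible service, not taken from the docs):

bin/spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.2 \
  --conf spark.hadoop.fs.s3a.endpoint=http://localhost:9000 \
  --conf spark.hadoop.fs.s3a.access.key=ACCESS_KEY \
  --conf spark.hadoop.fs.s3a.secret.key=SECRET_KEY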
version: '2'
services:
  openldap:
    image: osixia/openldap:1.2.3
    container_name: openldap
    environment:
      LDAP_LOG_LEVEL: "256"
      LDAP_ORGANISATION: "Example Inc."
      LDAP_DOMAIN: "example.org"
      LDAP_BASE_DN: ""
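A quick way to bring the service up and smoke-test it (a sketch assuming Docker Compose is installed; this image is assumed to default the admin password to "admin" and derive the base DN from LDAP_DOMAIN):

docker-compose up -d openldap
# Query from inside the container, binding as the admin user derived from LDAP_DOMAIN.
docker exec openldap ldapsearch -x -H ldap://localhost \
  -b dc=example,dc=org -D "cn=admin,dc=example,dc=org" -w admin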
#cloud-config
resize_rootfs: false
disk_setup:
  /dev/sda:
    table_type: 'mbr'
    layout:
      - 25
      - 75
    overwrite: true
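One way to try this user-data locally (assuming the cloud-image-utils package, which provides cloud-localds) is to build a NoCloud seed image and attach it to a VM alongside a stock cloud image:

# cloud-config.yaml is the file shown above; seed.img becomes the NoCloud data source.
cloud-localds seed.img cloud-config.yaml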
# Ruby is our language, as asciidoctor is a Ruby gem.
language: ruby
before_install:
  - sudo apt-get install pandoc
  - gem install asciidoctor
script:
  - make
after_success:
  - .travis/push.sh
env:
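The after_success hook runs .travis/push.sh, whose contents are not shown here. A hypothetical sketch of such a deploy script (the repository slug, the build/ output directory, and the GITHUB_TOKEN variable are all assumptions, not taken from the original):

#!/bin/sh
# Hypothetical deploy script: publish the build output to the gh-pages branch.
set -e
git clone --branch gh-pages "https://${GITHUB_TOKEN}@github.com/example/repo.git" out
cp -r build/* out/
cd out
git add -A
git commit -m "Deploy from Travis build ${TRAVIS_BUILD_NUMBER}"
git push origin gh-pages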
'''Example of a custom ReST directive in Python docutils'''
import docutils.core
from docutils.nodes import TextElement, Inline
from docutils.parsers.rst import Directive, directives
from docutils.writers.html4css1 import Writer, HTMLTranslator


class foo(Inline, TextElement):
    '''This node class is a no-op -- just a fun way to define some parameters.
    There are lots of base classes to choose from in `docutils.nodes`.
    '''
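The example is cut off above. A minimal sketch of how a custom-directive example like this typically continues (the FooDirective class, the span markup, and the translator name are assumptions, not recovered from the original): a Directive subclass that emits the node, a registration call, and an HTMLTranslator subclass with visit/depart methods for the new node:

class FooDirective(Directive):
    '''Directive that wraps its single argument in a foo node.'''
    required_arguments = 1

    def run(self):
        # Build a foo node carrying the directive's argument as its text.
        return [foo(text=self.arguments[0])]

# Make the directive available in reST source as ".. foo:: some-argument".
directives.register_directive('foo', FooDirective)

class MyHTMLTranslator(HTMLTranslator):
    # Render foo nodes as <span class="foo"> so CSS can target them.
    def visit_foo(self, node):
        self.body.append(self.starttag(node, 'span', '', CLASS='foo'))

    def depart_foo(self, node):
        self.body.append('</span>')

# Hook the custom translator into the HTML writer used by docutils.core.
html_writer = Writer()
html_writer.translator_class = MyHTMLTranslator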
<!doctype html>
<html>
<head>
  <meta charset="utf-8">
  <title>Page Title</title>
  <meta name="description" content="Webpage for xxxx">
  <!-- http://meyerweb.com/eric/tools/css/reset/ -->
  <link rel="stylesheet" href="css/reset/reset.css">
  <!--[if lt IE 9]>
    <script src="//html5shiv.googlecode.com/svn/trunk/html5.js" type="text/javascript"></script>
  <![endif]-->