beego manual
Logic
(to do https://github.com/feedlabs/elasticfeed/issues)
- plugins manager (internal API: storages access, run-time state; communication)
- plugins (initial scenario, pipeline, indexer, sensor)
- workflows manager (hooks distribution, in-out plugin flow, workflow-file templating)
- analytics storage (feed/entries metrics)
- plugins storage (user/app/feed data)
- runtime storage (sensors: location, weather, night/day, moon phase, sun activity)
- responses/requests templates
- caching with groupcache:
  - feed/entries storage
  - analytics storage
  - plugins storage
  - runtime sensors state
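The groupcache idea above — a cache that loads a value from backing storage on miss and serves it from memory afterwards — can be sketched single-node with the standard library alone. This is a hypothetical stand-in, not the real `groupcache` API; the `Group`/`GetterFunc` names only mirror its shape, and the `Loads` counter exists just to show that the backend is hit once.

```go
package main

import (
	"fmt"
	"sync"
)

// GetterFunc loads a value on cache miss (mirrors groupcache's Getter idea).
type GetterFunc func(key string) (string, error)

// Group caches loaded values; a hypothetical single-node stand-in
// for a groupcache.Group.
type Group struct {
	mu     sync.Mutex
	values map[string]string
	load   GetterFunc
	Loads  int // counts backend loads, to show caching works
}

func NewGroup(load GetterFunc) *Group {
	return &Group{values: make(map[string]string), load: load}
}

func (g *Group) Get(key string) (string, error) {
	g.mu.Lock()
	defer g.mu.Unlock()
	if v, ok := g.values[key]; ok {
		return v, nil // cache hit: no backend call
	}
	v, err := g.load(key)
	if err != nil {
		return "", err
	}
	g.Loads++
	g.values[key] = v
	return v, nil
}

func main() {
	entries := NewGroup(func(key string) (string, error) {
		return "entry-for-" + key, nil // stand-in for a feed-storage lookup
	})
	v, _ := entries.Get("feed:1")
	entries.Get("feed:1") // second call is served from cache
	fmt.Println(v, entries.Loads) // entry-for-feed:1 1
}
```

The same getter pattern would apply to each of the four storages listed above, with one group per storage.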
- distributed computing for streaming
- multi node simple pub/sub communication (own protocol)
- multi node pub/sub based on redis pub/sub (large production)
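The "own protocol" pub/sub for the simple case might look like the following single-process sketch: topics fan messages out to subscriber channels. This is an assumption about the intended shape (the `Broker` type and its methods are invented here); the redis-backed variant would replace `Publish` with a redis `PUBLISH` and each subscriber channel with a `SUBSCRIBE` connection.

```go
package main

import (
	"fmt"
	"sync"
)

// Broker is a minimal in-process pub/sub hub, sketching the
// "own protocol" single-node case before scaling out to redis.
type Broker struct {
	mu   sync.RWMutex
	subs map[string][]chan string
}

func NewBroker() *Broker {
	return &Broker{subs: make(map[string][]chan string)}
}

// Subscribe returns a buffered channel receiving every message
// published to topic.
func (b *Broker) Subscribe(topic string) <-chan string {
	ch := make(chan string, 16)
	b.mu.Lock()
	b.subs[topic] = append(b.subs[topic], ch)
	b.mu.Unlock()
	return ch
}

// Publish fans a message out to all subscribers of topic.
func (b *Broker) Publish(topic, msg string) {
	b.mu.RLock()
	defer b.mu.RUnlock()
	for _, ch := range b.subs[topic] {
		ch <- msg
	}
}

func main() {
	b := NewBroker()
	feed := b.Subscribe("feed:42")
	b.Publish("feed:42", "entry added")
	fmt.Println(<-feed) // entry added
}
```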
- entity vs controller: the entity/model should contain all public/private params; the controller should define the request/response interface for each API version separately
- resource
- apply ordering
- pagination
- response
- HTTP RESPONSE CODEs
- lowercase all IDs when JSON is sent to the client (should be defined by the controller for each API version)
- paging marker: last timestamp
- pretty=true printing
- ordering
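The entity-vs-controller split above can be sketched as a per-version wire struct: the internal model keeps its own field names, and the v1 controller maps it to a response with lowercase `id` keys. The `Entry`/`entryV1`/`renderV1` names are illustrative assumptions; the notes are also ambiguous about whether ID values (not just key names) are lowercased, so only the key names are handled here.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Entry is the internal model; the controller decides how it is
// serialized for each API version (hypothetical field set).
type Entry struct {
	ID   string
	Data string
}

// entryV1 is the v1 wire format: JSON key names are lowercased,
// as the notes require for JSON sent to clients.
type entryV1 struct {
	ID   string `json:"id"`
	Data string `json:"data"`
}

// renderV1 is the v1 controller's response mapping.
func renderV1(e Entry) ([]byte, error) {
	return json.Marshal(entryV1{ID: e.ID, Data: e.Data})
}

func main() {
	out, _ := renderV1(Entry{ID: "abc123", Data: "hello"})
	fmt.Println(string(out)) // {"id":"abc123","data":"hello"}
}
```

A v2 controller would add its own `entryV2` struct rather than changing the model, keeping versions independent.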
- request
- HEADERS: Content-Type: application/json
- ordering
- paging: timestamp-to-timestamp OR timestamp-to-requiredItemNumber
- http: timestamp, limit/timestamp, direction (asc/desc)
- websocket: timestamp, limit/timestamp, direction (asc/desc)
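The timestamp/limit/direction paging described above (with the last timestamp as the paging marker) might be implemented roughly like this. The `Page` helper and its parameters are assumptions drawn from the bullet list, not an existing API; it expects entries already sorted ascending by timestamp.

```go
package main

import "fmt"

// Entry carries the timestamp used as the paging marker.
type Entry struct {
	ID        string
	Timestamp int64
}

// Page returns up to limit entries strictly older ("desc") or newer
// ("asc") than the cursor timestamp. entries must be sorted ascending
// by Timestamp; the caller's next cursor is the last returned timestamp.
func Page(entries []Entry, cursor int64, limit int, direction string) []Entry {
	var out []Entry
	if direction == "desc" {
		for i := len(entries) - 1; i >= 0 && len(out) < limit; i-- {
			if entries[i].Timestamp < cursor {
				out = append(out, entries[i])
			}
		}
	} else {
		for _, e := range entries {
			if len(out) == limit {
				break
			}
			if e.Timestamp > cursor {
				out = append(out, e)
			}
		}
	}
	return out
}

func main() {
	entries := []Entry{{"a", 100}, {"b", 200}, {"c", 300}, {"d", 400}}
	for _, e := range Page(entries, 400, 2, "desc") {
		fmt.Println(e.ID, e.Timestamp)
	}
	// prints: c 300, then b 200
}
```

The same helper serves both the HTTP and websocket paths, since both pass the identical timestamp/limit/direction triple.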
- HEADERS:
- stream (Websocket interface for streaming elasticfeed#23)
- channel/feed - event/action - data
- channel/room -> feedAPI specific
- event -> ACTION: ADD, EDIT, DELETE, ??? (MUST BE HANDLED BY FRONT END PLUGIN)
- data -> full data or changes
- authorisation: Basic and digest authentication #11; docs; should use SSL; curl, ajax, websockets
- app: basic + digest: token based
- onUpdate: merge new data with old stored data based on ID
- re-stream user data
- linking entries to feed pages
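The onUpdate rule — merge new data with the old stored data based on ID — might look like the following sketch. `MergeByID` and the `Fields` map are invented names for illustration; the real entry model would have typed fields, but the ID-indexed merge is the same.

```go
package main

import "fmt"

// Entry is keyed by ID; Fields stands in for the entry's payload.
type Entry struct {
	ID     string
	Fields map[string]string
}

// MergeByID applies updates onto stored entries: matching IDs are
// merged field by field, unknown IDs are appended.
func MergeByID(stored, updates []Entry) []Entry {
	index := make(map[string]int, len(stored))
	for i, e := range stored {
		index[e.ID] = i
	}
	for _, u := range updates {
		if i, ok := index[u.ID]; ok {
			for k, v := range u.Fields {
				stored[i].Fields[k] = v // new data wins per field
			}
		} else {
			stored = append(stored, u) // unseen ID: append
		}
	}
	return stored
}

func main() {
	stored := []Entry{{"e1", map[string]string{"text": "old", "tag": "news"}}}
	updates := []Entry{
		{"e1", map[string]string{"text": "new"}},
		{"e2", map[string]string{"text": "fresh"}},
	}
	merged := MergeByID(stored, updates)
	fmt.Println(len(merged), merged[0].Fields["text"], merged[0].Fields["tag"])
	// prints: 2 new news
}
```

Note that fields absent from the update are kept, which is what makes this a merge rather than a replace.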
Deployment
- elasticfeed installed with puppet module (defined as a puppet class for nodes)
- elasticfeed linux package
- elasticfeed cross-platform support (binary release)
Feed
- this is LIST
- contains ENTRIES
- ENTRY contains TIMESTAMP
- TIMESTAMP provides TIMEAXIS ordering
- can be displayed as: LIST, AGGREGATED BY TIME or TAGS
- there are 4 types: PRIVATE, PUBLIC, GROUP, GLOBAL
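The LIST/ENTRIES/TIMESTAMP model above is small enough to state directly in code: a feed is a list of entries, and the timestamp provides the time-axis ordering. The type and method names are illustrative, and "newest first" is an assumption about display order.

```go
package main

import (
	"fmt"
	"sort"
)

// Entry carries the TIMESTAMP that places it on the time axis.
type Entry struct {
	ID        string
	Timestamp int64
}

// Feed is a LIST that contains ENTRIES.
type Feed struct {
	Entries []Entry
}

// SortByTime orders entries along the time axis, newest first
// (assumed display order).
func (f *Feed) SortByTime() {
	sort.Slice(f.Entries, func(i, j int) bool {
		return f.Entries[i].Timestamp > f.Entries[j].Timestamp
	})
}

func main() {
	f := Feed{Entries: []Entry{{"a", 100}, {"c", 300}, {"b", 200}}}
	f.SortByTime()
	for _, e := range f.Entries {
		fmt.Println(e.ID) // prints c, b, a
	}
}
```

Time-based or tag-based aggregation would then be a grouping pass over this already-ordered list.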
Prototype
Stream:
- stream_adapter
- stream_adapter_socket_redis (&redis_client)
- stream_message (&stream_adapter)
- stream_store
ServiceManager:
- service_graph (&config)
- service_stream (&config)
- service_cache (&config)
- service_js (&config)
Redis
- redis_client
Neo4j #6
- neo4j_client
Memcache #6
- memcache_client
Graph #6
- graph_adapter
- graph_neo4j_cypher (&neo4j_client)
- graph_entity
- graph_node
- graph_relation
- graph_database
- graph_query
- graph_storage (&graph_adapter)
- graph_store
Basics
Standards:
- FOAF http://xmlns.com/foaf/spec/
- FOAF Project http://www.foaf-project.org/
- N-Quad (RDF) http://www.w3.org/TR/n-quads/
- JSON-LD http://en.wikipedia.org/wiki/JSON-LD
- Gremlin http://gremlindocs.com/
- GremlinAPI from cayley https://github.com/google/cayley/blob/master/docs/GremlinAPI.md
Thoughts
tips:
- cayley stores simple types (string, int, date) and relations
_:subject1 <http://an.example/predicate1> "object1" <http://example.org/graph1> .
- mongo could store the object and then create a relation in cayley if needed. Each query would then require 2 calls: one to cayley and a second to mongo
Gremlin
```shell
curl -d '[{"subject":"bob","predicate":"follows","object":"dani"}]' localhost:64210/api/v1/write
curl -d "g.V('alice').Out('follows').Intersect(g.V('bob').Out('follows')).All()" localhost:64210/api/v1/query/gremlin
curl -d "g.V('alice').Out('follows').All()" localhost:64210/api/v1/query/gremlin
```
Important questions; b2b
0. what is feedlabs and what is it for?
1. how can we improve your content?
2. how can we improve your users' experience?
3. how can we improve your application UI?
4. how can we speed up your data delivery?
5. how can we save you computing instances?
6. how can we save you storage space?
7. how can we improve your analytics?
8. how can you track your users' behaviour?
9. how can we improve your knowledge about users expectations?
10. how can we index your content?
11. how can we suggest better content?
12. why should I use a new DSL language?
13. how can you make more money with us?
14. how can I migrate my data?