Sunday, 23 April 2017

Accessing Externally Hosted MongoDB inside Kubernetes or Minikube


We can directly access an externally hosted MongoDB from inside a pod by using its public IP.

But if the external MongoDB's IP changes in the future, we would have to update every pod that accesses the database.

A better option is to create a Service without a selector. Since there is no selector, no Endpoints will be created for it automatically. We then create the Endpoints object manually and point it at the externally hosted MongoDB address.

This way, when we access the Service, traffic is automatically routed to the Endpoints created for it.

And if the database's public IP changes later on, we will not need to update the pods; we will only need to update the Endpoints.

Below are the JSON files for it:

  1.   MongoDB Service without a selector:


{
    "kind": "Service",
    "apiVersion": "v1",
    "metadata": {
        "name": "mongodb"
    },
    "spec": {
        "ports": [{
            "protocol": "TCP",
            "port": 27017,
            "targetPort": 27017
        }]
    }
}
As you can see, we have not provided any selector, so no Endpoints will be created for the Service automatically.
MongoDB is reachable on its default port (27017).

2. Endpoints for the above Service:

{
    "kind": "Endpoints",
    "apiVersion": "v1",
    "metadata": {
        "name": "mongodb"
    },
    "subsets": [{
        "addresses": [{
            "ip": "30.188.60.252"
        }],
        "ports": [{
            "port": 27017,
            "protocol": "TCP"
        }]
    }]
}
Here the ip value (30.188.60.252) is the external endpoint, i.e. the public address of the hosted MongoDB. (JSON does not allow comments, so the note belongs here rather than inside the file.)

Note : The name in the metadata must match the newly created Service's name. That is how the Endpoints object and the Service are associated.



And to access the external MongoDB now, we can just use the Service name directly.

Ex :  let constr = "mongodb://abcd:abcd@mongodb";

Or we can use the Service's cluster IP instead; it will proxy the traffic to the endpoint.
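As a small sketch of what the connection string looks like from an application pod, the helper below builds the URI from the Service name. The credentials ("abcd:abcd"), database name, and namespace are placeholders; the short name works from the same namespace, while the fully qualified `<service>.<namespace>.svc.cluster.local` form works from anywhere in the cluster.

```javascript
// Sketch: building the MongoDB URI from the in-cluster Service DNS name.
// Credentials and database name below are placeholders, not real values.
function buildMongoUri(user, password, serviceName, namespace, dbName) {
  // Same-namespace pods can use the short Service name; the FQDN
  // <service>.<namespace>.svc.cluster.local resolves from any namespace.
  const host = namespace
    ? `${serviceName}.${namespace}.svc.cluster.local`
    : serviceName;
  return `mongodb://${user}:${password}@${host}:27017/${dbName}`;
}

// Same-namespace form, as used in the post:
console.log(buildMongoUri("abcd", "abcd", "mongodb", null, "testdb"));
// Cross-namespace form:
console.log(buildMongoUri("abcd", "abcd", "mongodb", "default", "testdb"));
```

Because the pods only ever see the Service name, swapping the external IP in the Endpoints object requires no application change.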

  

Sunday, 16 April 2017

Using a SharePoint Designer 2013 workflow, update/create an item in another site collection.

In this blog, I am taking the general scenario of updating/creating a list item from one site collection in another site collection.
The Statement:
"I have a master list in one site collection and a copy of the same list in another site collection. I would like to make sure that whenever there is an update to an item on the master list, the same item in the copy list in the other site collection gets updated."
Implementation :
Steps in brief :
  1. We will use the REST API in the SPD workflow to make the cross-site-collection call.
  2. On the source (master) list, we will write two workflows (2013 template): one on item added and the other on item updated.
  3. The item-added workflow will create a copy of the list item in the destination list using the REST API.
  4. The item-updated workflow will update that copy in the destination list whenever the master list entry is updated.
  5. To allow the master list workflow to create/update entries in the other site collection, we need to grant the workflow app permission on the target site.
  6. List schema:
Master list: name "Employee", columns: {"EmpName", "CTC"}
Destination list: name "Employee Backup", columns: {"EmpName", "CTC", "MasterListItemID"}
MasterListItemID will hold the item ID of the master list item.
Let's create the first workflow (on item added):

Three dictionary variables are required for the POST request:
  1. header : contains the Accept and Content-Type keys, both with the value "application/json;odata=verbose"

The header dictionary contains the Accept & Content-Type keys

2. metadata : this dictionary contains only one key, "type", whose value is SP.Data.[title of target list]ListItem


3. parameters : this dictionary contains the __metadata key and the column values.

The "MasterListItemID" column will hold the current item's ID.
Parameter dictionary (the __metadata value is itself of type Dictionary)

We will associate all these dictionary variables with the "Call HTTP Web Service" action. The method will be POST, as we are creating an item.
The request URL will be: https://targetsitecollection/_api/web/lists/getbytitle('Target_List_Name')/Items
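The three dictionaries map onto a plain HTTP request. The sketch below shows that mapping; the host name comes from the example URL above, the field values are made up, and the entity type name is the one SharePoint typically generates for a list titled "Employee Backup" (spaces become _x0020_). Verify the real name via .../getbytitle('Employee Backup')?$select=ListItemEntityTypeFullName before relying on it.

```javascript
// Sketch of the HTTP request the three SPD dictionaries represent.
// Host, field values, and the entity type name are illustrative assumptions.
function buildCreateItemRequest(listTitle, entityTypeName, fields) {
  return {
    method: "POST",
    url: "https://targetsitecollection/_api/web/lists/getbytitle('" +
         encodeURIComponent(listTitle) + "')/Items",
    headers: {
      // the "header" dictionary
      "Accept": "application/json;odata=verbose",
      "Content-Type": "application/json;odata=verbose"
    },
    // the "parameters" dictionary, with the "metadata" dictionary nested
    // under the __metadata key
    body: Object.assign({ "__metadata": { "type": entityTypeName } }, fields)
  };
}

const req = buildCreateItemRequest(
  "Employee Backup",
  "SP.Data.Employee_x0020_BackupListItem", // typical generated name; verify it
  { EmpName: "John", CTC: "100000", MasterListItemID: 7 }
);
```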

Save this workflow and update its settings so that it triggers on item added only.
Now, to allow this workflow to create items in the target site collection, we need to grant it permission.
Go to the site settings of the source site (holding the master list) -> Site app permissions


Copy the app identifier (the ID between the last | and the @) for the next steps (in my example this was 8f20f240-ddde-45dc-a08a-66834769220d).
Now, manually add this app identifier on the target site collection (the site holding the target list). To do this, follow the steps below:
a. Open the appinv.aspx page:
http://{the Site Collection}/{target-site}/_layouts/15/appinv.aspx.
b. Paste the app identifier from the source site, look up the rest of the information, and add the required permission request XML in the App's Permission Request XML field.
The first part is now done. Let's publish the workflow, create a new item in the master list, and check the workflow status; if it completed, the item will have been created successfully in the target list.
Note : Make sure that the user who creates the item in the master list also has permission on the target site collection/site. Otherwise, we can put the actions inside an "App Step" in the workflow.
-----------------------------------------------------------------------

Now, let's write the second workflow, which will trigger when a master list item is updated.
The item-updated workflow will have two stages:
  1. Fetch the item ID of the copy of the master list item in the target list, because updating the target list item via the REST API requires its item ID.
  2. Update the target list item.
We have stored the master list item's ID in the target list field MasterListItemID. So whenever a master list item is updated, we take the current item's ID and query the target list for the item whose MasterListItemID equals it.


So we will write the first request as:
https://target_site_collection/_api/web/lists/getbytitle('target_list')/Items?$filter=MasterListItemID eq [%Current Item:ID%]
which will return the target list item's ID; we will then use that ID to update the list item.
For now we expect only one result in the response, so I am directly using "d/results(0)/ID" to get the item ID. Later on we can check the response item count and add some validation.
The header dictionary will just hold the two keys (Accept/Content-Type), and the request will be a GET request.
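For clarity, the SPD path "d/results(0)/ID" corresponds to d.results[0].ID in the verbose JSON the REST API returns. The sketch below extracts it from a sample payload (the payload contents are illustrative):

```javascript
// Sketch: extracting the target item's ID from the $filter GET response.
// "d/results(0)/ID" in SPD is d.results[0].ID in the verbose JSON.
function getFirstItemId(response) {
  const results = response.d && response.d.results;
  if (!results || results.length === 0) {
    return null; // no matching copy found; worth validating in the workflow
  }
  return results[0].ID;
}

// Illustrative response shape for a single matching item:
const sample = { d: { results: [{ ID: 12, EmpName: "John", MasterListItemID: 7 }] } };
console.log(getFirstItemId(sample)); // 12
```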

Now that the first stage is complete, we will write the second stage to update the item.

This stage will also use three dictionary variables (header, metadata, parameters).
The metadata and parameters dictionaries are the same as in the first workflow, so create them as described in the item-added workflow.
The header dictionary contains two additional keys, since we are updating an existing item:
X-HTTP-Method and If-Match are the two additional keys.

In the "Call HTTP Web Service" action we will point to the target list item, using the item ID that we stored in the previous stage:
https://target_site_collection/_api/web/lists/getbytitle('target_list')/items([%Variable:SecondaryListItemID%])
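The update request can be sketched the same way as the create request. The post names the two extra header keys but not their values; "MERGE" and "*" are the standard choices for an unconditional SharePoint REST update (MERGE turns the POST into a partial update, and If-Match: "*" skips the ETag concurrency check). The host, item ID, and field values below are illustrative.

```javascript
// Sketch of the stage-two update request. Header values "MERGE" and "*"
// are the usual choices; the post names only the keys, not the values.
function buildUpdateItemRequest(listTitle, itemId, entityTypeName, fields) {
  return {
    method: "POST",
    url: "https://target_site_collection/_api/web/lists/getbytitle('" +
         encodeURIComponent(listTitle) + "')/items(" + itemId + ")",
    headers: {
      "Accept": "application/json;odata=verbose",
      "Content-Type": "application/json;odata=verbose",
      "X-HTTP-Method": "MERGE", // treat the POST as a partial update
      "If-Match": "*"           // update regardless of the item's ETag
    },
    body: Object.assign({ "__metadata": { "type": entityTypeName } }, fields)
  };
}

const update = buildUpdateItemRequest(
  "Employee Backup", 12,                     // 12 = ID fetched in stage one
  "SP.Data.Employee_x0020_BackupListItem",   // verify against the real list
  { EmpName: "John", CTC: "120000" }
);
```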
Now just save and publish the workflow and try updating an item.
Note : the permission only needs to be granted once, and we already did that while writing the first workflow.
References :
http://blog.portiva.nl/2016/11/03/sharepoint-designer-call-http-web-service-to-create-item-in-other-site-collection/

Using Google Apps Script, upload a file to Google Drive and insert data into a spreadsheet

For inserting the data into the spreadsheet, I have created a sample form with a limited number of fields, and on form submission I send a POST request to the server.
The action attribute of the form points to the Google Apps Script endpoint.
Please do watch this video for better clarity.

Also, to upload multiple files to Google Drive, please refer to the
updated code in the repository mentioned below.

Upload multiple files to Google Drive using Google Apps Script.

Here is the snippet for the html form:
<article id="content1" contenteditable="true">
<p>
<form id="uploadForm" action="Your script end point" method="POST">
<input type="hidden" value="" name="fileContent" id="fileContent">
<input type="hidden" value="" name="filename" id="filename">
<label> Name : </label><input required type="text" value="" name="name" id="name">
<label> Email :</label> <input required type="text" value="" name="email" id="email">
<label> Contact : </label><input required type="text" value="" name="contact" id="contact">
<label> SkillSets :</label> <input required type="text" value="" name="skillsets" id="skillsets">
<label> LinkedIn Account:</label><input type="text" value="" name="linkedinUrl" id="linkedinUrl">
</form>
<input required id="attach" name="attach" type="file"/>
<input value="Submit" type="button" onclick="UploadFile();" />
<script>
function UploadFile() {
  var reader = new FileReader();
  var file = document.getElementById('attach').files[0];
  reader.onload = function () {
    // Put the data-URL content and file name into the hidden fields,
    // then submit the form to the Apps Script endpoint.
    document.getElementById('fileContent').value = reader.result;
    document.getElementById('filename').value = file.name;
    document.getElementById('uploadForm').submit();
  };
  reader.readAsDataURL(file);
}
</script>
</p>
</article>
And here is the Google Apps Script snippet:
<article id="content2" contenteditable="true">
<p>
// Change this to your email address.
var emailTo = "emailaddress@anydomain.com";
function doPost(e) {
try {
var data = e.parameter.fileContent;
var filename = e.parameter.filename;
var email = e.parameter.email;
var name = e.parameter.name;
var result=uploadFileToGoogleDrive(data,filename,name,email,e);
return ContentService // return json success results
.createTextOutput(
JSON.stringify({"result":"success",
"data": JSON.stringify(result) }))
.setMimeType(ContentService.MimeType.JSON);
} catch(error) { // if error return this
Logger.log(error);
return ContentService
.createTextOutput(JSON.stringify({"result":"error", "error": error}))
.setMimeType(ContentService.MimeType.JSON);
}
}
// new property service GLOBAL
var SCRIPT_PROP = PropertiesService.getScriptProperties();
// see: https://developers.google.com/apps-script/reference/properties/
/**
* select the sheet
*/
function setup() {
var doc = SpreadsheetApp.getActiveSpreadsheet();
SCRIPT_PROP.setProperty("key", doc.getId());
}
/**
* record_data inserts the data received from the html form submission
* e is the data received from the POST
*/
function record_data(e,fileUrl) {
try {
var doc = SpreadsheetApp.openById(SCRIPT_PROP.getProperty("key"));
var sheet = doc.getSheetByName('responses'); // select the responses sheet
var headers = sheet.getRange(1, 1, 1, sheet.getLastColumn()).getValues()[0];
var nextRow = sheet.getLastRow()+1; // get next row
var row = [ new Date() ]; // first element in the row should always be a timestamp
// loop through the header columns
for (var i = 1; i < headers.length; i++) { // start at 1 to avoid Timestamp column
if(headers[i].length > 0 && headers[i] == "resume") {
row.push(fileUrl); // add data to row
}
else if(headers[i].length > 0) {
row.push(e.parameter[headers[i]]); // add data to row
}
}
// more efficient to set values as [][] array than individually
sheet.getRange(nextRow, 1, 1, row.length).setValues([row]);
}
catch(error) {
Logger.log(error); // log the actual error, not the event object
}
finally {
return;
}
}
function uploadFileToGoogleDrive(filename, name, email, e) is renamed below to avoid shadowing:
function uploadFileToGoogleDrive(data, filename, name, email, e) {
try {
var dropbox = "Demo";
var folder, folders = DriveApp.getFoldersByName(dropbox);
if (folders.hasNext()) {
folder = folders.next();
} else {
folder = DriveApp.createFolder(dropbox);
}
// Extract the content type and base64 payload from the data URL.
var contentType = data.substring(5, data.indexOf(';')),
bytes = Utilities.base64Decode(data.substr(data.indexOf('base64,') + 7)),
blob = Utilities.newBlob(bytes, contentType, filename);
// Create a per-applicant subfolder and store the uploaded file in it.
var file = folder.createFolder([name, email].join("-")).createFile(blob);
var fileUrl = file.getUrl();
//Generating Email Body
var html =
'<body>' +
'<h2> New Job Application </h2>' +
'<p>Name : '+e.parameters.name+'</p>' +
'<p>Email : '+e.parameters.email+'</p>' +
'<p>Contact : '+e.parameters.contact+'</p>' +
'<p>Skill Sets : '+e.parameters.skillsets+'</p>' +
'<p>LinkedIn Url : '+e.parameters.linkedinUrl+'</p>' +
'<p>File Name : '+e.parameters.filename+'</p>' +
'<p><a href='+file.getUrl()+'>Resume Link</a></p><br />' +
'</body>';
record_data(e,fileUrl);
MailApp.sendEmail(emailTo, "New Job Application Received", "New Job Application Request Received", {htmlBody: html});
return file.getUrl();
} catch (f) {
return ContentService // return json error result
.createTextOutput(
JSON.stringify({"result":"file upload failed",
"data": JSON.stringify(f) }))
.setMimeType(ContentService.MimeType.JSON);
}
}</p>
</article>


How to call a web service/REST API in a SharePoint Designer workflow (2013)

Hello Guys, 

Here is a detailed video of calling a REST API/web service in a SharePoint Designer 2013 workflow: Calling a Rest API in SharePoint Designer Workflow.


Calling a Rest API in SharePoint Designer Workflow


Thanks, 
Utkarsh