Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal
Subscribe to the Planet Drupal feed
Only a few more sleeps until BADCamp! Grace Lovelace Fri, 10/13/2017 - 12:12pm

BADCamp is almost here! Just five more sleeps to go. We’d like to share some details about event logistics and making the most of your time at BADCamp.

Make Sure You Are Registered!

While BADCamp is both awesome and free, signing up for BADCamp helps us plan and ensures you receive event specific information.

 

Want to be Trained? You Need to Sign Up for Free Training

A few last minute cancellations means we have a few seats still available. Sign up soon to reserve your spot!

 

Want to Summit? You Need to Sign up for Summits

Wednesday and Thursday, we'll be hosting great summits that facilitate conversations and connections with people in specific industries or with specific skills. Come dive deep into the issues that matter and collaborate freely. Registration is open and while attendance is free, signing up will ensure you receive summit specific information for the event.

 

Want to Make A Session Schedule?

Are you a super planner? Make your session schedule in advance and then follow along on your mobile device! Take a look at the final session schedule. There are BADCamp sessions spanning the worlds of development, design, strategy, project management, technology communities and everything in between.

  Join us at the Contribution Lounge for coffee, community, and code!

This is a great chance to help make Drupal 8 bigger and better. The BADCamp Contribution Lounge is at the Hotel Shattuck. The Lounge has Internet access and an ample supply of coffee and water. We're open around the clock from Wednesday, October 18 at 9 am to Saturday, October 21 at 10 pm. Come participate!

 

Parties

BADCamp Party on Friday Night

Come to the official BADCamp party at The Marsh Theatre on Friday night.  Doors open earlier than planned at 6:30 to maximize our time on the ROOF TOP!

Please seek out our generous sponsors, Platform.sh, Hook42 and Lullabot, and give them a BIG thanks; this party would not have been possible without their generous support.

We will have drink tickets burning a hole in our pocket, so come early and be prepared for a good time. There will be great music, and ample space on the Dance Floor. There will also be tables and quiet areas to chat. For more info...

 

We need your help: Volunteer for BADCamp!

BADCamp is 100% volunteer driven and we need your hands! We need stout hearts to volunteer and help set up, tear down, give directions and so much more!  If you are local and can help us, please contact Manish at info@badcamp.net or sign up on our Volunteer Form.

 

Would you have been willing to pay for your ticket?  

If so, then you can give back to the camp by purchasing an individual sponsorship at the level most comfortable for you. As our thanks, we will be handing out some awesome BADCamp swag including a 2017 edition t-shirt, hoodie and stellar solar charger.

 

Sponsors

A BIG thanks to our sponsors! Without them this magical event wouldn’t be possible. An extra big thanks to Platform.sh, Pantheon & Acquia for sponsoring at the Core level to help keep BADCamp free and awesome.

Drupal Planet

Recently I've been thinking a lot about what is missing that could help the Drupal project achieve greater success. This was partly in preparation for the Drupal Strategy Summit but also a continuation of research I was already working on.

Many Drupal friendships are created in issue queues, over IRC, on Twitter or in Google Hangouts, and often across continents. I can think of many personal examples. Maybe you can too? So it seems incongruous to me that Drupal.org has no community search feature. We are one of the world's biggest communities, yet we have no way to find one another.

There is a helpful Where is the Drupal Community? page, but it lacks the ability to search for people like me: people with shared interests and shared motivations to contribute, people I can collaborate with. I feel this is a massive missed opportunity to connect like minds. If such a tool existed, newcomers would be far more likely to have a positive experience and find an outlet for their passion.

I have written a proposal in the Issue Queue for Drupal.org content. If you have thoughts around this feature request, I'd appreciate you joining the conversation.

File Entity module co-maintainers Devin Carlson and Dave Reid meet for the first time after a Media BoF at DrupalCon Portland, 2013. They live in Sudbury, Canada and Omaha, Nebraska, respectively. Thousands of similar friendships are formed through Drupal contributions. Photo by Ezra Gildesgame

Have you ever wondered how the text, email or entity reference field is extended in Drupal 8? Or how to create a custom field, widget or formatter so that it matches the rest of the fields in your Drupal application? This blog will cover everything required to extend existing field widgets in Drupal 8 using annotation plugins.

Many developers who recently started working on Drupal 8 may not be aware of the entire process, so let's take a closer look at everything step by step: key comparisons between Drupal 7 and Drupal 8, what an annotation is, why annotations, and a sample use case from an Inline Entity Form. After completing this post, you will be able to extend a field with your own methods/functions without…
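
As a taste of what the full walkthrough covers, here is a minimal, hypothetical widget plugin that extends the core string textfield widget through an annotation. The module name my_module, the plugin ID and the class name are all invented for this sketch; only the base class and the formElement() hook are core APIs.

<?php

namespace Drupal\my_module\Plugin\Field\FieldWidget;

use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Field\Plugin\Field\FieldWidget\StringTextfieldWidget;
use Drupal\Core\Form\FormStateInterface;

/**
 * A hypothetical widget extending the core 'string_textfield' widget.
 *
 * @FieldWidget(
 *   id = "my_module_prefixed_textfield",
 *   label = @Translation("Prefixed textfield"),
 *   field_types = {
 *     "string"
 *   }
 * )
 */
class PrefixedTextfieldWidget extends StringTextfieldWidget {

  /**
   * {@inheritdoc}
   */
  public function formElement(FieldItemListInterface $items, $delta, array $element, array &$form, FormStateInterface $form_state) {
    // Start from whatever the parent widget builds, then tweak it.
    $element = parent::formElement($items, $delta, $element, $form, $form_state);
    $element['value']['#field_prefix'] = $this->t('Ref:');
    return $element;
  }

}
?>

After a cache rebuild, a widget like this appears as an option on the Manage form display tab for plain-text (string) fields.
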

Config Split treats all the CLIs the same.

Drupal 8.4 and its upgrade to Symfony 3 have made compatibility with the global Drush 8 a bit more challenging. Drush 9 works with Drupal 8.4, but it is not stable yet, and the format in which third-party Drush commands are written has changed significantly.

While Drush 9 comes with a command that helps port Drush 8 commands, you will still end up maintaining very similar code in two places: one with calls to drush_confirm('...') and one with $this->io()->confirm('...'). If you decide to also provide your commands for Drupal console, you now have three times the burden.

Because we tried to provide the commands for Config Split for both Drush and Drupal console early on, we faced this problem more than a year ago. That has now paid off, because porting the commands to Drush 9 was very quick.

The solution is actually really simple and brings the added benefit of being able to test the business logic of the commands in the absence of Drush or Drupal console. It is all about separating command discovery from command logic. Drush 8, Drush 9 and Drupal console all discover and invoke commands in slightly different ways, but the business logic you want to implement is the same. So all we have to do is extract a common "interface" our custom service can implement, and then make the command definitions wrap that and keep things DRY.

The CliService

Config Split defines a config_split.cli service with the class ConfigSplitCliService with all its dependencies injected. It has the methods \Drupal\config_split\ConfigSplitCliService::ioExport and \Drupal\config_split\ConfigSplitCliService::ioImport that implement all the commands logic and delegate the actual importing and exporting to specific methods.

The method signature for both the export and the import method is more or less the same: CliService::ioMethod($arguments, $io, callable $t), with the following parameters (a sketch of such a method follows the list).

  • $arguments: The arguments passed to the command.
  • $io: This is an object that interacts with the command line. In Drush 9 and Drupal console this comes from the Symfony console component; for Drush 8 we created a custom wrapper around drush_confirm and drush_log called ConfigSplitDrush8Io.
  • $t: This is essentially a t function, akin to how Drupal translates strings. Because Drupal console translates things differently, we had to be a bit creative and add a t method to the command.
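
To make the shape of that pattern concrete, here is a minimal sketch of such a service method. This is not the actual module code: the class name, the messages and the doExport() helper are invented for illustration; only the ioMethod($arguments, $io, callable $t) signature follows the description above.

<?php

/**
 * A minimal, hypothetical stand-in for a CLI service implementing ioExport().
 */
class ExampleCliService {

  public function ioExport($split, $io, callable $t) {
    // $io can be the Symfony-based io object (Drush 9, Drupal console) or a
    // custom Drush 8 wrapper; it only needs confirm() and success() methods.
    if (!$io->confirm($t('Export the configuration?'))) {
      return;
    }
    // Delegate the actual exporting to a dedicated method.
    $this->doExport($split);
    $io->success($t('Configuration successfully exported.'));
  }

  protected function doExport($split) {
    // Placeholder for the real export logic.
  }

}
?>
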
Commands wrap the service

The Drush 8 command is essentially:

<?php
function drush_config_split_export($split = NULL) {
  // Make the magic happen.
  \Drupal::service('config_split.cli')->ioExport($split, new ConfigSplitDrush8Io(), 'dt');
}
?>
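
The ConfigSplitDrush8Io wrapper mentioned above is not shown in the article. As a rough, hypothetical sketch (the class and method names here are invented), such a wrapper only needs to expose the same kind of methods the Symfony io objects provide and delegate to the Drush 8 procedural functions:

<?php

/**
 * A hypothetical Drush 8 io wrapper in the spirit of ConfigSplitDrush8Io.
 */
class ExampleDrush8Io {

  public function confirm($question) {
    // Delegate to the Drush 8 confirmation prompt.
    return drush_confirm($question);
  }

  public function success($message) {
    drush_log($message, 'ok');
  }

  public function error($message) {
    drush_log($message, 'error');
  }

}
?>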

For Drush 9 we can use dependency injection and the Drush 9 command becomes essentially:

<?php
class ConfigSplitCommands extends DrushCommands {
  public function splitExport($split = NULL) {
    $this->cliService->ioExport($split, $this->io(), 'dt');
  }
}
?>
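
The $this->cliService property is injected through the command's constructor, with the wiring declared in the module's drush.services.yml. A hedged sketch, using an invented class name, of what that looks like on the PHP side:

<?php

use Drush\Commands\DrushCommands;

class ExampleSplitCommands extends DrushCommands {

  protected $cliService;

  public function __construct($cliService) {
    parent::__construct();
    // The config_split.cli service is handed in by the container.
    $this->cliService = $cliService;
  }

}
?>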

And the Drupal console command is very similar:

<?php
class ExportCommand extends SplitCommandBase {
  protected function execute(InputInterface $input, OutputInterface $output) {
    $this->setupIo($input, $output);
    // Make the magic happen.
    $this->cliService->ioExport($input->getOption('split'), $this->getIo(), [$this, 't']);
  }
}
?>

Testing

The ConfigSplitCliServiceTest is a KernelTest which asserts that the export works as expected by exporting to a virtual file system. The test coverage is not 100% (patches welcome), but the most important aspects of the complete and conditional splitting (blacklist/graylist) are thoroughly tested. There are no limitations on what or how you can test your CliService, since it is self-contained in your module and does not depend on Drush or the Drupal console. For example, one could write a unit test with a mocked $io object that asserts that the messages printed to the cli are correct.
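
As a hedged sketch of that idea, here is what such a unit test could look like, written against the hypothetical ExampleCliService from the earlier sketch rather than the real ConfigSplitCliService. A small anonymous fake stands in for the mocked $io object and records the printed messages.

<?php

use PHPUnit\Framework\TestCase;

class ExampleCliServiceTest extends TestCase {

  public function testExportPrintsSuccessMessage() {
    // A tiny fake io object that records what the service prints.
    $io = new class {
      public $messages = [];
      public function confirm($question) {
        return TRUE;
      }
      public function success($message) {
        $this->messages[] = $message;
      }
    };

    $service = new ExampleCliService();
    // An identity closure stands in for the translation callable.
    $service->ioExport('default', $io, function ($string) {
      return $string;
    });

    $this->assertSame(['Configuration successfully exported.'], $io->messages);
  }

}
?>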

Tags: Drupal 8, Drupal Planet, Drush

Here is another recipe for success. You can have a whole team of websites playing for you, and they don’t have to be created from scratch or managed separately. The secret lies in Drupal’s well-developed multisite functionality. Thanks to this, Drupal will not only let you leave your competitors behind, but also multiply this effect by many times.

Read more
A Homeowner's Guide to Drupal Security

Working in our Managed Services department, we handle many Drupal 7 and 8 sites, all of which have one thing in common. Despite their different requirements, designs and content, they all need security updates applied and are all in need of some care and attention when it comes to securing them. If a Drupal site were a house: Securi...
Going Local with Lando Crell Thu, 10/12/2017 - 20:43 Blog

Platform.sh aims to be a complete solution for web development and hosting, while at the same time offering the flexibility to slot into your own development tools and methodologies. That's a tricky balance to strike at times: Providing a complete solution while offering maximum flexibility at the same time.

One area where we generally favor flexibility is in your local development environment. You can use whatever local development tools you're most comfortable with: MAMP, WAMP, VirtualBox, Docker, or just a native local install of your needed tools.

For those who fear analysis-paralysis from so many choices, though, we've decided to start reviewing and green-lighting recommended tools that we've found work well. And the first local development tool we can recommend is Lando.

Lando is a Docker-based local development environment that grew out of Kalabox, a VirtualBox-based local dev tool for Drupal. Lando is much more flexible and lighter-weight than a virtual machine-based solution, and has direct support for a variety of systems including Drupal, Laravel, Backdrop, and WordPress. It even goes beyond PHP with support for Node.js, Python, and Ruby as well, just as we do.

Like Platform.sh, Lando is controlled by a YAML configuration file. Although, being Docker-based, it cannot directly mimic how a Platform.sh project works, it can approximate one reasonably well.

We've included a recommended Lando configuration file in our documentation. It's fairly straightforward and easy to adapt for your particular application. It's also possible to synchronize data from a Platform.sh environment to your local Lando instance in just a few short commands. Lando's own documentation provides more details on how to trick out your local system with whatever you may need.

We still believe in allowing you to pick your own development workflow, so you don't have to change anything if you already have a workflow that works for you; if you want our advice, though, Lando is a solid option that should get you up and running locally in minutes, while Platform.sh handles all of your staging and production needs.

Larry Garfield 18 Oct, 2017

Not all CMS are created equal. Before building your next website, here are a few tips on why your CMS choice matters. (Plot twist: as told by an Account Manager.)

Read More
SEO Checklist Before Launching Your Drupal Website Dmitrii Susloparov Thu, 10/12/2017 - 17:11

Search Engine Optimization (SEO) might not be the first thing you think of when designing a new website, but building an optimized framework from the start will help you drive traffic to your site and keep it there. With our Drupal SEO checklist in hand you can build an excellent website that draws customers from launch day. Briefly, here is a bullet list of what to check before launch day. Below we’ll discuss each point in more detail.

 

  • Check that all web pages have unique titles using the Page Title module

  • Check if XML Sitemap and Google News Sitemap are configured properly

  • Check if Redirect module is enabled and configured

  • Check if Global Redirect module is enabled and configured

  • Check that .htaccess redirects to the site with/without www

  • Check that the homepage title includes a slogan, and is descriptive for the function of the site

  • Check if Meta Tags is filled with descriptive information

  • Check that OG tags are filled correctly and with descriptive information.

  • Check if site's information appears well when shared on Facebook

  • Check if Path aliases patterns are meaningful

  • Check if Google Analytics is enabled and configured

  • Check if Page Title module is enabled and configured

  • Check if Google News Sitemap is enabled and configured

  • Check if Site verification is enabled and configured

  • Check if Search 404 module is enabled and configured

 

Drupal SEO: 12 Things that Will Improve Your Site's Ranking

Check that all web pages have unique titles...

...and make sure to write them correctly. All of your pages should be easily identifiable to the end user. Not only should they have unique titles, they should have meaningful titles. Having multiple pages with the same titles (like “Get in touch”, “Contact us” and “Make a booking”) will simply confuse your end users and search engine crawlers.

 

Not only do good page titles help customers who are already on your site, but they help with social sharing, and picking your site out of search engine results. Titles are the first element that any user will see, whether they come directly to your site, find it in a search engine, or see it shared on social media.

 

Writing good titles is extremely important, and having keywords in your title that match a user's search greatly improves the chances of them clicking on your page.

 

Ensuring all your pages have a unique name will help users navigate, boost your SEO ratings, and increase the chances that someone will type the right keywords into a search engine to bring them to your site.

 

You can set up unique page titles much more easily if you install the Drupal Page Title module.

10 Drupal Modules that Will Boost Your Website’s SEO

 

 

Check if XML Sitemap and Google News Sitemap are configured properly

The XML Sitemap module creates a robot friendly map of your site that Google and other search engines can crawl to categorise your website. There are a few settings you can alter for your site at admin/config/search/xmlsitemap and you can view the sitemap from http://yoursite.com/sitemap.xml.

 

You should configure XML Sitemap early in your site build for the best effect, but you can also alter the settings later on if needed.

 

Google News Sitemap offers a similar but distinct service that creates a Google-specific map, as suggested by the name. These two modules work nicely side by side to make your site easy for search engines to crawl and index.

 

Please note that if your site contains AMPs, there is no need to create sitemaps for them. The rel=amphtml link is enough for Google to pick up on the accelerated mobile page version, which means you can easily gain traffic from Top Stories carousels and mobile search. Creating AMP on your Drupal site became easy with our step-by-step guide.

 

 

Check if Redirect module is enabled and configured

Redirect is a handy module for making sure users always make it to your site. It uses case-insensitive matching to help catch broken links with redirects and tracks how often users are hitting those redirects. You can use redirects to capture any broken links, set up promotional links, or simply capture typos users are entering when trying to access your site.

 

Check if Global Redirect module is enabled and configured

If you’re using Drupal 8 you can skip this one, because the functionality has been rolled into the Redirect module. Otherwise, install Global Redirect to work in tandem with Redirect to catch any broken links. Global Redirect will test all links with and without a trailing slash, ensure links are case-insensitive, and if a link is truly broken it will return the user to your home page rather than an ugly 404 page that decreases the position of your site in SERPs.

Check that .htaccess redirects to site with/without www

Some users attempting to visit your site will navigate to www.yoursite.com, while others will simply type yoursite.com. By setting up your site to handle either request you can be sure you won’t miss any visitors.

 

 

Check that the homepage title includes a headline, logo and primary image and is descriptive for the function of the site

The headline as well as the slogan represent who you are as a business. Make your first impression a good one as this will also be visible on search engines. This is a good opportunity to stack your website with SEO friendly keywords, but don’t go overboard and sacrifice your image for it - keyword stuffing may not only decrease the trust index of your site, but also its conversion rates.

Ensure Metatags are filled with descriptive information

Writing SEO-optimized metatags is highly important, because they remain one of the top on-page ranking factors. Make sure to install the Metatag module on your site to have an easy, user friendly interface for updating metadata. With the module installed you can easily populate metadata with keywords, page descriptions, and more.

 

SEO tips for your Drupal site

 

The Metatag module will also give you extra control over how your site appears when shared on Twitter or Facebook.

Check that OG tags are filled correctly and with descriptive information.

OG tags are metatags specifically designed to ensure your site communicates nicely with Facebook. By setting these tags correctly you will be able to control exactly how your site appears on Facebook, including what images and what taglines are used.

Check if site's information appears well when shared on Facebook and Twitter

After configuring the Metatag module and OG tags, pop over to Facebook and make sure that your site shares the way you would like it to. It’s important to test this out now, before users start sharing your site around.

 

Similarly try tweeting a couple of your pages to see how well your Twitter Cards come through. If you don’t want to show your site to your audience until you are sure it is set up properly, you can check Twitter Cards using the Card Validator.

 

For more information on configuring Twitter cards, check out the Twitter user guides.

 

Check if Path aliases patterns are meaningful

By default Drupal will set your URLs to node/123 - while this works great for the database back end, it doesn’t work well for your end users, or for search engines.

 

You can use the Pathauto module to create rules and patterns for your URLs that will significantly cut down on your maintenance times and simplify your site navigation.

Check if Google Analytics is enabled and configured

While having Google Analytics configured won’t improve your SEO, it will give you all the data you need to understand where your users are coming from and how they behave once they hit your site.

 

Installing the Google Analytics module makes setting up and configuring Google Analytics a breeze.

Check if Site verification is enabled and configured

The Site verification module makes it easy to check the boxes that tell search engines that your site is truly yours. Having your site verified will improve how search engines crawl your site, and for Google it will allow you to access private search data. With site verification you will receive better data and better search engine rankings for just a few minutes' work.

 

Check if Search 404 module is enabled and configured

The Search 404 module is a saving grace for reducing your bounce rate and improving your SEO and your customer experience. Instead of finding an "Error: Page not Found" message in place of the content they were hoping for, users will be offered a search of your site based on the URL string. For example, if www.yoursite.com/great-seo-tips doesn't exist, this module will automatically search your site for "great SEO tips" and show users the results.

 

 

Bottom line

While SEO may seem like a tricky subject to wrap your head around, the basics are easy with the right modules and the right guidance. Drupal is a great content management system for building search engine optimized websites.

 

With our SEO checklist you can get off on the right foot, and here at Vardot we love educating our customers to build top quality websites. If you’re looking for even more ways to improve your site's SEO, have a look at the SEO articles in our blog or get in touch with us.

In my previous post, Modern Decoupling is More Performant, we discussed how saving HTTP round-trips has a very positive impact on performance. In particular, we demonstrated how the JSON API module could help your application by returning multiple entities in a single request. Doing so eliminates the need for making an individual request per entity. However, this is only possible when fetching entities, not when writing data and only if those entities are related to the entry point (a particular entity or collection).

Sometimes you can solve this problem by writing a custom resource in the back-end every time, but that can lead to many custom resources, which impacts maintainability and is tiresome. If your API is public and you don’t have prior knowledge of what the consumers are going to do with it, it’s not even possible to write these custom endpoints.

The Subrequests module completes that idea by allowing ANY set of requests to be aggregated together. It can aggregate them even when one of them depends on a previous response. The module works with any request; it's not limited to REST or any other constraint. For simplicity, all the examples here will make requests to JSON API.

Why Do We Need It?

The main concept of the Subrequests module is that instead of sending multiple requests to your Drupal instance, we send only a single request. In this master request, we provide the information about the requests we need to make in a JSON document. We call this document a blueprint.

A blueprint is a JSON document containing the instructions for Drupal to make all those requests in our name. The blueprint document contains a list of subrequest objects. Each subrequest object contains the information about a single request being aggregated in the blueprint.

Imagine that our consumer application has a decoupled editorial interface. This editorial interface contains a form to create an article. As part of the editorial experience, we want the form to create the article and a set of tags in the Drupal back-end.

Without using Subrequests, the consumer application should execute the following requests when the form is submitted:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user, based on the username present in the editorial app.
  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.
  • Create the article in the form using the user UUID and the newly created tags.

We can query for the user and the vocabulary in parallel. Once that is done, and using the information in the vocabulary response, we can create the tag entities. Once those are created, we can finally create the article. In total, we would be making five requests at three sequential levels. And, this is not even a complex example!


A JavaScript pseudo-code for the form submission handler could look like:

console.log('Article creation started…');
Promise.all([
  httpRequest('GET', 'https://cms.contentacms.io/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags'),
  httpRequest('GET', 'https://cms.contentacms.io/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin'),
])
  .then(res => {
    const [vocab, user] = res;
    return Promise.all([
      Promise.resolve(user),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag1, headers),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag2, headers),
    ]);
  })
  .then(res => {
    const [user, tag1, tag2] = res;
    const body = buildBodyForArticle(formData, user, tag1, tag2);
    return httpRequest('POST', 'https://cms.contentacms.io/api/articles', body, headers);
  })
  .then(() => {
    console.log('Article creation finished!');
  });

Using Subrequests

Our goal is to have JavaScript pseudo-code that looks like:

console.log('Article creation started…');
const blueprint = buildBlueprint(formData);
httpRequest('POST', 'https://cms.contentacms.io/api/subrequests?_format=json', blueprint, headers)
  .then(() => {
    console.log('Article creation finished!');
  });

We've reduced our application code to a single POST request that contains a blueprint in the request body. We have reduced the problem to the blueprint creation. That is a big improvement in the developer experience of consumer applications.

Parallel Requests

In our current task we need to perform two initial HTTP requests that can be run in parallel:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user based on the username in the editorial app.

That translates to the following blueprint:

[ { "requestId": "vocabulary", "action": "view", "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags", "headers": ["Accept": "application/vnd.application+json"] }, { "requestId": "user", "action": "view", "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin", "headers": ["Accept": "application/vnd.application+json"] } ]

For each subrequest, we can observe that we are providing four keys:

  • requestId A string used to identify the subrequest. This is an arbitrary value generated by the consumer application.
  • action Identifies the action being performed. A "view" action will generate a GET request. A "create" action will generate a POST request, etc.
  • uri The URL where the subrequest will be sent.
  • headers An object containing the headers specific for this subrequest.

The response to this blueprint (after adjusting the permissions in Drupal to view users and vocabularies) will return the response to both subrequests:

{ "vocabulary": { "headers": { "content-id": ["<vocabulary>"], "status": [200] }, "body": "{\"data\":[{\"type\":\"vocabularies\",\"id\":\"47ce8895-0df6-44a4-af43-9ef3b2a924dd\",\"attributes\":{\"status\":true,\"dependencies\":{\"module\":[\"recipes_magazin\"]},\"_core\":\"HJlsFfKP4PFHK1ub6QCSNFmzAnGiBG7tnx53eLK1lnE\",\"name\":\"Tags\",\"vid\":\"tags\",\"description\":\"Use tags to group articles on similar topics into categories.\",\"hierarchy\":0,\"weight\":0},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies\\/47ce8895-0df6-44a4-af43-9ef3b2a924dd\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies?filter%5Bvid-filter%5D%5Bcondition%5D%5Bpath%5D=vid\\u0026filter%5Bvid-filter%5D%5Bcondition%5D%5Bvalue%5D=tags\"}}" }, "user": { "headers": { "content-id": ["<user>"], "status": [200] }, "body": "{\"data\":[{\"type\":\"users\",\"id\":\"a0b7af80-e319-4271-899f-f151d3fbfc8e\",\"attributes\":{\"internalId\":1,\"name\":\"admin\",\"mail\":\"admin@example.com\",\"timezone\":\"Europe\\/Madrid\",\"isActive\":true,\"createdAt\":\"2017-09-15T15:47:26+0200\",\"updatedAt\":\"2017-09-15T20:06:15+0200\",\"access\":1505565434,\"lastLogin\":\"2017-09-15T20:06:07+0200\"},\"relationships\":{\"roles\":{\"data\":[]}},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users\\/a0b7af80-e319-4271-899f-f151d3fbfc8e\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users?filter%5Badmin%5D%5Bcondition%5D%5Bpath%5D=name\\u0026filter%5Badmin%5D%5Bcondition%5D%5Bvalue%5D=admin\"}}" } }

In the (simplified) response above we can see that for each subrequest, we have one key in the response object. That key is the same as our requestId in the blueprint. Each one of the subresponses contains the information about the response headers and the response body. Note how the response body is an escaped JSON object.

This blueprint is not sufficient to create an article with two tags, but it's a great start. Let's build on top of that to create the tags and the article.

Dependent Requests

The next task we need to execute is the creation of the two tag entities:

  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.

To do this, we will need to expand the blueprint. However, we don't know the vocabulary UUID at the time we are writing the blueprint. What we do know is that the vocabulary UUID will be in the subresponse to the vocabulary subrequest. In particular, we can find the UUID in data[0].id.

We will use that information to create a blueprint that can create tags. Since we don't know the actual value of the vocabulary UUID, we will use a replacement token. At some point, during the blueprint processing by Drupal, the token will be resolved to the actual UUID value.

Replacement Tokens

We can use replacement tokens anywhere in the body or the URI of our subrequests. For those to be resolved, a token needs to be formatted in the following way:

{{<requestId>.<"body"|"headers">@<json-path-expression>}}

In particular, the replacement token for our vocabulary UUID will be:

{{vocabulary.body@$.data[0].id}}

What this replacement says is:

  1. Use the subresponse for the vocabulary subrequest.
  2. Take the body from that subresponse.
  3. Extract the string under data[0].id, by executing the JSON Path expression $.data[0].id. You can execute any JSON Path expression as long as it returns a string. JSON Path is a very powerful way to extract data from an arbitrary JSON object, in our case the body in subresponse to the vocabulary subrequest.

This is what our blueprint looks like after adding the subrequests to create the tag entities. Note the presence of the replacement tokens:

[ { "requestId": "vocabulary", "action": "view", "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags", "headers": {"Accept": "application/vnd.api+json"} }, { "requestId": "user", "action": "view", "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin", "headers": {"Accept": "application/vnd.api+json"} }, { "action": "create", "requestId": "tags-1", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] }, { "action": "create", "requestId": "tags-2", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] } ]

Note that to use a replacement token in a subrequest, we need to add a dependency on the subresponse that contains the information. That's why we added the waitFor key in our tag subrequests.

Finishing the Blueprint

Using the same principles that we used for the tags we can add the subrequest for:

  • Create the article in the form using the user UUID and the newly created tags.

That will leave our completed blueprint as:

[ { "requestId": "vocabulary", "action": "view", "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags", "headers": {"Accept": "application/vnd.api+json"} }, { "requestId": "user", "action": "view", "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin", "headers": {"Accept": "application/vnd.api+json"} }, { "action": "create", "requestId": "tags-1", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] }, { "action": "create", "requestId": "tags-2", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] }, { "action": "create", "requestId": "article", "headers": {"Content-Type": "application/vnd.api+json"}, "body": "{\"data\":{\"type\":\"articles\",\"attributes\":{\"body\":\"Custom value\",\"default_langcode\":\"1\",\"langcode\":\"en\",\"promote\":\"1\",\"status\":\"1\",\"sticky\":\"0\",\"title\":\"Article Created via Subrequests!\"},\"relationships\":{\"tags\":{\"data\":[{\"id\":\"{{tags-1.body@$.data.id}}\",\"type\":\"tags\"},{\"id\":\"{{tags-2.body@$.data.id}}\",\"type\":\"tags\"}]},\"type\":{\"data\":{\"id\":\"article\",\"type\":\"contentTypes\"}},\"owner\":{\"data\":{\"id\":\"{{user.body@$.data[0].id}}\",\"type\":\"users\"}}}}}", "uri": "/api/articles", "waitFor": ["user", "tags-1", "tags-2"] } ] More Powerful Replacements

Imagine that instead of creating an article for a single user, we wanted to create an article for each one of the users on the site. We cannot write a simple blueprint, like the one above, since we don't know how many users there are in the Drupal site. Hence, we cannot write an article creation subrequest for each user.

To solve this problem we can tweak the user subrequest so that, instead of returning a single user, it returns all the users on the site:

[
  …
  {
    "requestId": "user",
    "action": "view",
    "uri": "/api/users",
    "headers": {"Accept": "application/vnd.api+json"}
  },
  …
]

Then in our replacement tokens, we can write a JSON Path expression that will return a list of user UUIDs, instead of a single string. Subrequests will accept JSON Path expressions that return either strings or an array of strings for the replacement tokens.

In our article creation subrequest we will need to change {{user.body@$.data[0].id}} to {{user.body@$.data[*].id}}. The Subrequests module will create a duplicate of the article subrequest for each replacement item. In our case this has the effect of creating a copy of the article creation subrequest for each available user in the user subresponse.

The Final Response

The modified blueprint that generates one article per user will return one keyed subresponse for each article subrequest that was generated.

We can see how a single subrequest can generate n subresponses, and we can use each one of those to generate n other subresponses, and so on. This highlights how powerful this technique is. In addition, we have seen that we can combine different types of operations; in our example, we mixed GET and POST in a single blueprint (to get the vocabulary and create the new tags).

Conclusion

Subrequests are a great way to fetch or write many resources in a single HTTP request. This allows us to improve performance significantly while maintaining almost the same flexibility that custom code provides.

Further Your Understanding

If you want to know more about the blueprint format you can read the specification. The Subrequests module comes with a JSON schema that you can use to validate your blueprint. You can find the schema here.

The hero image was downloaded from Frankenphotos and used without modifications under a CC BY 3.0 license.

With cybercrime on the rise, securing data in Drupal has become a hot topic for developers and project stakeholders alike.

In our latest webinar, we were joined by three Drupal security experts from Townsend Security, Lockr and Mediacurrent who shared their approach for building a secure groundwork to protect site data in Drupal.

Four months ago, I shared that Acquia was on the verge of a shift equivalent to the decision to launch Acquia Fields and Drupal Gardens in 2008. As we entered Acquia's second decade, we outlined a goal to move from content management to data-driven customer journeys. Today, Acquia announced two new products that support this mission: Acquia Journey and Acquia Digital Asset Manager (DAM).

Last year on my blog, I shared a video that demonstrated what is possible with cross-channel user experiences and Drupal. We showed a sample supermarket chain called Gourmet Market. Gourmet Market wants its customers to not only shop online using its website, but to also use Amazon Echo or push notifications to do business with them. The Gourmet Market prototype showed an omnichannel customer experience that is both online and offline, in store and at home, and across multiple digital touchpoints. The Gourmet Market demo video was real, but required manual development and lacked easy customization. Today, the launch of Acquia Journey and Acquia DAM makes building these kinds of customer experiences a lot easier. It marks an important milestone in Acquia's history, as it will accelerate our transition from content management to data-driven customer journeys.

Introducing Acquia Journey

I've written a great deal about the Big Reverse of the Web, which describes the transition from "pull-based" delivery of the web, meaning we visit websites, to a "push-based" delivery, meaning the web comes to us. The Big Reverse forces a major re-architecture of the web to bring the right information, to the right person, at the right time, in the right context.

The Big Reverse also ushers in the shift from B2C to B2One, where organizations develop a one-to-one relationship with their customers, and contextual and personalized interactions are the norm. In the future, every organization will have to rethink how it interacts with customers.

Successfully delivering a B2One experience requires an understanding of your user's journey and matching the right information or service to the user's context. This alone is no easy feat, and many marketers and other digital experience builders often get frustrated with the challenge of rebuilding customer experiences. For example, although organizations can create brilliant campaigns and high-value content, it's difficult to effectively disseminate marketing efforts across multiple channels. When channels, data and marketing software act in different silos, it's nearly impossible to build a seamless customer experience. The inability to connect customer profiles and journey maps with various marketing tools can result in unsatisfied customers, failed conversion rates, and unrealized growth.

Acquia Journey delivers on this challenge by enabling marketers to build data-driven customer journeys. It allows marketers to easily map, assemble, orchestrate and manage customer experiences like the one we showed in our Gourmet Market prototype.

It's somewhat difficult to explain Acquia Journey in words — probably similar to trying to explain what a content management system does to someone who has never used one before. Acquia Journey provides a single interface to define and evaluate customer journeys across multiple interaction points. It combines a flowchart-style journey mapping tool with unified customer profiles and an automated decision engine. Rules-based triggers and logic select and deliver the best-next action for engaging customers.

One of the strengths of Acquia Journey is that it integrates many different technologies, from marketing and advertising technologies to CRM tools and commerce platforms. This makes it possible to quickly assemble powerful and complex customer journeys.

Acquia Journey will simplify how organizations deliver the "best next experience" for the customer. Providing users with the experience they not only want, but expect will increase conversion rates, grow brand awareness, and accelerate revenue. The ability for organizations to build more relevant user experiences not only aligns with our customers' needs but will enable them to make the biggest impact possible for their customers.

Acquia's evolving product offering also puts control of user data and experience back in the hands of the organization, instead of walled gardens. This is a step toward uniting the Open Web.

Introducing Acquia Digital Asset Manager (DAM)

Digital asset management systems have been around for a long time, and were originally hosted through on-premise servers. Today, most organizations have abandoned on-premise or do-it-yourself DAM solutions. After listening to our customers, it became clear that large organizations are seeking a digital asset management solution that centralizes control of creative assets for the entire company.

Many organizations lack a single source of truth when it comes to managing digital assets. This challenge has been amplified as the number of assets has rapidly increased in a world with more devices, more channels, more campaigns, and more personalized and contextualized experiences. Acquia DAM provides a centralized repository for managing all rich media assets, including photos, videos, PDFs, and other corporate documents. Creative and marketing teams can upload and manage files in Acquia DAM, which can then be shared across the organization. Graphic designers, marketers and web managers all have a hand in translating creative concepts into experiences for their customers. With Acquia DAM, every team can rely on one dedicated application to gather requirements, share drafts, consolidate feedback and collect approvals for high-value marketing assets.

On top of Drupal's asset and media management capabilities, Acquia DAM provides various specialized functionality, such as automatic transcoding of assets upon download, image and video mark-up during approval workflows, and automated tagging for images using machine learning and image recognition.

By using a drag-and-drop interface on Acquia DAM, employees can easily publish approved assets in addition to searching the repository for what they need.

Acquia DAM seamlessly integrates with both Drupal 7 and Drupal 8 (using Drupal's "media entities"). In addition to Drupal, Acquia DAM is built to integrate with the entirety of the Acquia Platform. This includes Acquia Lift and Acquia Journey, which means that any asset managed in the Acquia DAM repository can be utilized to create personalized experiences across multiple Drupal sites. Additionally, through a REST API, Acquia DAM can also be integrated with other marketing technologies. For example, Acquia DAM supports designers with a plug-in for Adobe Creative Cloud, which integrates with Photoshop, InDesign and Illustrator.

Acquia's roadmap to data-driven customer journeys

Throughout Acquia's first decade, we've been primarily focused on providing our customers with the tools and services necessary to scale and succeed with content management. We've been very successful with helping our customers scale and manage Drupal and cloud solutions. Drupal will remain a critical component of our customers' success, and we will continue to honor our history as committed supporters of open source, in addition to investing in Drupal's future.

However, many of our customers need more than content management to be digital winners. The ability to orchestrate customer experiences using content, user data, decisioning systems, analytics and more will be essential to an organization's success in the future. Acquia Journey and Acquia DAM will remove the complexity from how organizations build modern digital experiences and customer journeys. We believe that expanding our platform will be good not only for Acquia, but for our partners, the Drupal community, and our customers.

Drupal Camp Dublin is Next Week - Last Chance for Tickets

It seems like only yesterday that we held DrupalCon in Dublin; now we're back with our annual Drupal Camp Dublin.

markconroy Wed, 10/11/2017 - 19:44

This year's Drupal Camp Dublin has a great line up of speakers from Ireland and abroad, covering such topics as:

  • Building multi-lingual, multi-region websites (Stella Power)
  • Working as a developer with attention-deficit disorder - ADD (Levi Govaerts)
  • Planning for disruptions (Jochen Lillich)
  • Migrating from Drupal 4 to 5 to 6 to 7 to 8 (Alan Burke)
  • Automating deployments (Luis Rodriguez)
  • Working with Webform, Commerce, Paragraphs, Display Suite and more (Chandeep Khosa)
  • Live debugging a site that's giving issues (Anthony Lindsay)
  • Deploy with Fabric, and test driven development (Oliver Davies)
  • Design in the Browser (yours truly, me, Mark Conroy)
  • Teaching web development at third level (Ruairi O'Reilly)
  • The QA process (Daniel Shaw)
  • Getting started with Docker (Ed Crompton)
  • The new theme coming to Drupal core (Mark Conroy)

And then there's some socials, and our Drupal Ireland AGM, and at least one other talk not announced yet, and ... you get the idea.

The full schedule is available on our website. There are some tickets left (only €20), so get them before they are all gone.

Today, there was a Moderately Critical security advisory for an Access Bypass vulnerability in the netFORUM Authentication module for Drupal 7:

netFORUM Authentication - Moderately critical - Access Bypass - SA-CONTRIB-2017-077

The module was bypassing the protections on the Drupal 7 user login form that deter brute-force attempts to log in to the site, and so it constituted an Access Bypass vulnerability by making login less secure when this module was in use.

However, Drupal 6 (including Pressflow 6) doesn't have these same protections for the user login form, and so using this module is no less secure than using vanilla Drupal 6. Of course, these protections could be added to this module, and while that would be great security hardening, their absence doesn't represent a vulnerability - only a weakness which is also present (and widely known) in Drupal 6 core.

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mediacurrent has been selected as a finalist for the 2017 Acquia Engage Awards in the categories of Financial Services, Travel and Tourism, and Digital Experience. These awards recognize the amazing sites and digital experiences that leading digital agencies are building with the Acquia Platform.

This blog has been re-posted with permission from Dries Buytaert's blog. Please leave your comments on the original post.

Last week at DrupalCon Vienna, I proposed adding a modern JavaScript framework to Drupal core. After the keynote, I met with core committers, framework managers, JavaScript subsystem maintainers, and JavaScript experts in the Drupal community to discuss next steps. In this blog post, I look back on how things have evolved, since the last time we explored adding a new JavaScript framework to Drupal core two years ago, and what we believe are the next steps after DrupalCon Vienna.

As a group, we agreed that we had learned a lot from watching the JavaScript community grow and change since our initial exploration. We agreed that today, React would be the most promising option given its expansive adoption by developers, its unopinionated and component-based nature, and its well-suitedness to building new Drupal interfaces in an incremental way. Today, I'm formally proposing that the Drupal community adopt React, after discussion and experimentation has taken place.

Two years ago, it was premature to pick a JavaScript framework

Three years ago, I developed several convictions related to "headless Drupal" or "decoupled Drupal". I believed that:

  1. More and more organizations wanted a headless Drupal so they can use a modern JavaScript framework to build application-like experiences.
  2. Drupal's authoring and site building experience could be improved by using a more modern JavaScript framework.
  3. JavaScript and Node.js were going to take the world by storm and that we would be smart to increase the amount of JavaScript expertise in our community.

(For the purposes of this blog post, I use the term "framework" to include both full MV* frameworks such as Angular, and also view-only libraries such as React combined piecemeal with additional libraries for managing routing, states, etc.)

By September 2015, I had built up enough conviction to write several long blog posts about these views (post 1, post 2, post 3). I felt we could accomplish all three things by adding a JavaScript framework to Drupal core. After careful analysis, I recommended that we consider React, Ember and Angular. My first choice was Ember, because I had concerns about a patent clause in Facebook's open-source license (since removed) and because Angular 2 was not yet in a stable release.

At the time, the Drupal community didn't like the idea of picking a JavaScript framework. The overwhelming reactions were these: it's too early to tell which JavaScript framework is going to win, the risk of picking the wrong JavaScript framework is too big, picking a single framework would cause us to lose users that favor other frameworks, etc. In addition, there were a lot of different preferences for a wide variety of JavaScript frameworks. While I'd have preferred to make a bold move, the community's concerns were valid.

Focusing on Drupal's web services instead

By May of 2016, after listening to the community, I changed my approach; instead of adding a specific JavaScript framework to Drupal, I decided we should double down on improving Drupal's web service APIs. Instead of being opinionated about what JavaScript framework to use, we would allow people to use their JavaScript framework of choice.

I did a deep dive on the state of Drupal's web services in early 2016 and helped define various next steps (post 1, post 2, post 3). I asked a few of the OCTO team members to focus on improving Drupal 8's web services APIs; funded improvements to Drupal core's REST API, as well as JSON API, GraphQL and OpenAPI; supported the creation of Waterwheel projects to help bootstrap an ecosystem of JavaScript front-end integrations; and most recently supported the development of Reservoir, a Drupal distribution for headless Drupal. There is also a lot of innovation coming from the community with lots of work on the Contenta distribution, JSON API, GraphQL, and more.

The end result? Drupal's web service APIs have progressed significantly the past year. Ed Faulkner of Ember told us: "I'm impressed by how fast Drupal made lots of progress with its REST API and the JSON API contrib module!". It's a good sign when a core maintainer of one of the leading JavaScript frameworks acknowledges Drupal's progress.

The current state of JavaScript in Drupal

Looking back, I'm glad we decided to focus first on improving Drupal's web services APIs; we discovered that there was a lot of work left to stabilize them. Cleanly integrating a JavaScript framework with Drupal would have been challenging 18 months ago. While there is still more work to be done, Drupal 8's available web service APIs have matured significantly.

Furthermore, by not committing to a specific framework, we are seeing Drupal developers explore a range of JavaScript frameworks and members of multiple JavaScript framework communities consuming Drupal's web services. I've seen Drupal 8 used as a content repository behind Angular, Ember, React, Vue, and other JavaScript frameworks. Very cool!

There is a lot to like about how Drupal's web service APIs matured and how we've seen Drupal integrated with a variety of different frameworks. But there is also no denying that not having a JavaScript framework in core came with certain tradeoffs:

  1. It created a barrier for significantly leveling up the Drupal community's JavaScript skills. In my opinion, we still lack sufficient JavaScript expertise among Drupal core contributors. While we do have JavaScript experts working hard to maintain and improve our existing JavaScript code, I would love to see more experts join that team.
  2. It made it harder to accelerate certain improvements to Drupal's authoring and site building experience.
  3. It made it harder to demonstrate how new best practices and certain JavaScript approaches could be leveraged and extended by core and contributed modules to create new Drupal features.

One trend we are now seeing is that traditional MV* frameworks are giving way to component libraries; most people seem to want a way to compose interfaces and interactions with reusable components (e.g. libraries like React, Vue, Polymer, and Glimmer) rather than use a framework with a heavy focus on MV* workflows (e.g. frameworks like Angular and Ember). This means that my original recommendation of Ember needs to be revisited.

Several years later, we still don't know what JavaScript framework will win, if any, and I'm willing to bet that waiting two more years won't give us any more clarity. JavaScript frameworks will continue to evolve and take new shapes. Picking a single one will always be difficult and to some degree "premature". That said, I see React having the most momentum today.

My recommendations at DrupalCon Vienna

Given that it's been almost two years since I last suggested adding a JavaScript framework to core, I decided to bring the topic back in my DrupalCon Vienna keynote presentation. Prior to my keynote, there had been some renewed excitement and momentum behind the idea. Two years later, here is what I recommended we do next:

  • Invest more in Drupal's API-first initiative. In 2017, there is no denying that decoupled architectures and headless Drupal will be a big part of our future. We need to keep investing in Drupal's web service APIs. At a minimum, we should expand Drupal's web service APIs and standardize on JSON API. Separately, we need to examine how to give API consumers more access to and control over Drupal's capabilities.
  • Embrace all JavaScript frameworks for building Drupal-powered applications. We should give developers the flexibility to use their JavaScript framework of choice when building front-end applications on top of Drupal — so they can use the right tool for the job. The fact that you can front Drupal with Ember, Angular, Vue, React, and others is a great feature. We should also invest in expanding the Waterwheel ecosystem so we have SDKs and references for all these frameworks.
  • Pick a framework for Drupal's own administrative user interfaces. Drupal should pick a JavaScript framework for its own administrative interface. I'm not suggesting we abandon our stable base of PHP code; I'm just suggesting that we leverage JavaScript for the things that JavaScript is great at by moving relevant parts of our code from PHP to JavaScript. Specifically, Drupal's authoring and site building experience could benefit from user experience improvements. A JavaScript framework could make our content modeling, content listing, and configuration tools faster and more application-like by using instantaneous feedback rather than submitting form after form. Furthermore, using a decoupled administrative interface would allow us to dogfood our own web service APIs.
  • Let's start small by redesigning and rebuilding one or two features. Instead of rewriting the entirety of Drupal's administrative user interfaces, let's pick one or two features, and rewrite their UIs using a preselected JavaScript framework. This allows us to learn more about the pros and cons, allows us to dogfood some of our own APIs, and if we ultimately need to switch to another JavaScript framework or approach, it won't be very painful to rewrite or roll the changes back.
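
To make the first recommendation above a bit more concrete, the sketch below shows how any JavaScript front end might consume Drupal content once JSON API is the standard. The base URL is hypothetical, and the /jsonapi/node/article path and query parameters follow the contributed JSON API module's defaults, so treat this as an illustration rather than an API reference:

```typescript
// Hypothetical Drupal site; the JSON API module exposes collections at
// /jsonapi/{entity_type}/{bundle} by default.
const DRUPAL_BASE = 'https://example.com';

// Minimal shape of a JSON:API response, per the JSON:API specification.
interface JsonApiResource {
  type: string;
  id: string;
  attributes: Record<string, unknown>;
}

interface JsonApiCollection {
  data: JsonApiResource[];
}

// Fetch the titles of published articles from the Drupal site.
async function fetchArticleTitles(): Promise<string[]> {
  const url =
    `${DRUPAL_BASE}/jsonapi/node/article` +
    `?filter[status]=1&fields[node--article]=title`;
  const response = await fetch(url, {
    headers: { Accept: 'application/vnd.api+json' },
  });
  if (!response.ok) {
    throw new Error(`Request failed with HTTP ${response.status}`);
  }
  const collection: JsonApiCollection = await response.json();
  return collection.data.map((resource) => String(resource.attributes.title));
}

// Any consumer (React, Vue, Ember, or a native application) can call the
// same API, which is the flexibility the second recommendation asks for.
fetchArticleTitles().then((titles) => console.log(titles));
```
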
Selecting a JavaScript framework for Drupal's administrative UIs

In my keynote, I proposed a new strategic initiative to test and research how Drupal's administrative UX could be improved by using a JavaScript framework. The feedback was very positive.

As a first step, we have to choose which JavaScript framework will be used as part of the research. Following the keynote, we had several meetings at DrupalCon Vienna to discuss the proposed initiative with core committers, all of the JavaScript subsystem maintainers, as well as developers with real-world experience building decoupled applications using Drupal's APIs.

There was unanimous agreement that:

  1. Adding a JavaScript framework to Drupal core is a good idea.
  2. We want to have sufficient real-use experience to make a final decision prior to 8.6.0's development period (Q1 2018). To start, the Watchdog page would be the least intrusive interface to rebuild and would give us important insights before kicking off work on more complex interfaces (a rough component sketch follows below).
  3. While a few people named alternative options, React was our preferred option, by far, due to its high degree of adoption, component-based and unopinionated nature, and its potential to make Drupal developers' skills more future-proof.
  4. This adoption should be carried out in a limited and incremental way so that the decision is easily reversible if better approaches come later on.

We created an issue on the Drupal core queue to discuss this more.
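
To show why the Watchdog page is such a low-risk starting point, here is a rough sketch of what a rebuilt log listing could look like as a React component written in TypeScript, as referenced in the list above. The /api/watchdog endpoint and the field names are assumptions made for this example (core does not ship such a listing endpoint today), so this is a sketch of the approach rather than the initiative's actual code:

```tsx
import React, { useEffect, useState } from 'react';

// Shape of a single log entry as the hypothetical /api/watchdog endpoint is
// assumed to return it; the field names mirror the dblog table columns.
interface LogEntry {
  wid: number;
  type: string;
  severity: number;
  message: string;
  timestamp: number;
}

// A minimal Watchdog listing: fetch the entries once, then render a table.
// Read-only data and a simple list view keep the rebuild small and low-risk.
export function WatchdogList() {
  const [entries, setEntries] = useState<LogEntry[]>([]);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    fetch('/api/watchdog?_format=json')
      .then((response) => {
        if (!response.ok) throw new Error(`HTTP ${response.status}`);
        return response.json() as Promise<LogEntry[]>;
      })
      .then(setEntries)
      .catch((e: Error) => setError(e.message));
  }, []);

  if (error) {
    return <p>Could not load log messages: {error}</p>;
  }

  return (
    <table>
      <thead>
        <tr>
          <th>Type</th>
          <th>Date</th>
          <th>Message</th>
        </tr>
      </thead>
      <tbody>
        {entries.map((entry) => (
          <tr key={entry.wid}>
            <td>{entry.type}</td>
            <td>{new Date(entry.timestamp * 1000).toLocaleString()}</td>
            <td>{entry.message}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
```

If the experiment does not pan out, throwing away a single self-contained component like this is cheap, which is the "limited and incremental" property the group agreed on.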

Conclusion

Drupal should support a variety of JavaScript libraries on the user-facing front end while relying on a single shared framework as a standard across Drupal administrative interfaces.

In short, I continue to believe that adopting more JavaScript is important for the future of Drupal. My original recommendation to include a modern JavaScript framework (or JavaScript libraries) for Drupal's administrative user interfaces still stands. I believe we should allow developers to use their JavaScript framework of choice to build front-end applications on top of Drupal and that we can start small with one or two administrative user interfaces.

After meeting with core maintainers, JavaScript subsystem maintainers, and framework managers at DrupalCon Vienna, I believe that React is the right direction for Drupal's administrative interfaces, though we encourage everyone in the community to discuss our recommendation. Adopting React in this way would allow us to make Drupal easier to use for site builders and content creators in an incremental and reversible way, keep Drupal developers' skills relevant in an increasingly JavaScript-driven world, and move us ahead with modern tools for building user interfaces.

Special thanks to Preston So for contributions to this blog post and to Matt Grill, Wim Leers, Jason Enter, Gábor Hojtsy, and Alex Bronstein for their feedback during the writing process.

The Hitchhiker's Guide to the Planet Drupal christophe Tue, 10/10/2017 - 22:24 In this newcomer guide, you will find:
  • How to accelerate the onboarding process and how to get a fresh Drupal 8 install for testing.
  • The documentation reduced to the essentials for the following topics: tools, projects, Drupal concepts and drupalisms, main events, contribution, and service providers.
  • A brief comparison of other solutions, and when to use Drupal.

There are at least 42 reasons to onboard drupalship.org!

To say that payment gateways are much improved in Commerce 2.x is a bit of an understatement. The process of implementing a payment gateway has been cut down to about a third of the time, with more functionality rather than less.

5 Ways Web Development Project Management Will Make Your Project More Successful Lily Berman Tue, 10/10/2017 - 10:45

As account managers at Elevated Third, we manage many projects across our accounts. Web development project management is intangible though not unimportant. We do not create wireframes or write code, so our direct impact on the Drupal websites Elevated Third produces may be less clear to our clients.

During the sales process, some clients see their communication budget as an unnecessary expense. Much as donors prefer charities that keep overhead spending low, the thinking goes that limiting the communication budget means more time goes to execution, right?

Maybe not. In the same way that a successful benefit event can dramatically increase the funds available for a nonprofit’s mission, strong account management directly contributes to our clients achieving their business goals across projects.

So, how does an account manager foster a successful Drupal project at Elevated Third?

  1) The account manager is the single consistent knowledge holder throughout the life of a project

Our team’s level of involvement will vary throughout a project. While UX has a large impact in the beginning, developers complete the majority of the tasks at the end. The account manager is the only member of the team who is in every meeting from kickoff to launch. It can be frustrating (and often expensive) for clients when the team veers from their vision. As a consistent project knowledge holder, an account manager can guide the team to ensure that they are considering the big picture, even when the client is not in the room.

For Instance: A designer knows he needs to create the visual design for the project. He reviews what he believes is the necessary documentation, but misses the client’s email update describing her new brand direction. He spends hours designing with the original brand guidelines in mind, then presents the design to the client. The client is then frustrated that her feedback was not implemented and that additional hours will be needed to modify the design. As our contracts are time and materials, every additional hour spent on a project has a corresponding cost to our clients.

When an account manager is involved in a project, she is part of every conversation and reviews every client email. This means no feedback will get lost in translation and costly adjustments will be avoided. Account managers are not responsible for creating any element of the website, so we can focus on ensuring that our clients and end users are kept in mind in every meeting and throughout the whole project.

  2) Account managers keep budget and timeline top of mind

A core part of the account manager’s role is managing the client’s budget and timeline. No other member of the team has that responsibility. We balance designers and developers who, given the chance, would often prefer to build the most beautiful, perfectly user-friendly functionality possible. Their desire to build the best thing ever is valuable, but it has to be balanced with the client’s budget and timeline needs. The account manager sets deadlines and monitors burndown throughout the project. From early discussions of which features will be prioritized to consistent check-ins and tweaks throughout execution, account managers ensure that the project aligns with the established constraints.

For instance: A UX strategist, excited about how valuable the tool we are building will be for its users, starts planning her user testing. She creates a first round of prototypes and tests with five users. Their feedback is so beneficial that she creates another iteration of prototypes to test with another five users, and then tests a third round. Although she has gained valuable insight, she has now used half of the project hours that were allocated for visual design, as the budget did not accommodate extensive user testing. When an account manager takes on the role of web development project management, she knows the scope and the hours that are allocated for each task. She completes a variety of checks and balances to ensure the execution aligns with the project constraints.

  3) Account managers communicate with clients and with the team

Custom web development can often be a mysterious and complex process. Luckily, an account manager has learned to translate jargon for our clients. As a result of working in this industry, we understand the terminology used along with the impact of the choices we are asking our clients to make. Not only do we coordinate meetings and send status updates to keep clients in the loop, but we are also uniquely equipped to ensure they understand the process. This means that our team can stay focused on their tasks and more efficiently complete work with minimal interruptions.

For Instance: A developer has spent an hour working on a very complex task. Knowing that he needs to maximize concentration and minimize interruptions, he silences all of his notifications. This practice, called going “heads down,” is common when tackling problem-solving tasks. During this time, a client reaches out with an extremely urgent issue. Since he is the only person who could answer her request, it lingers for hours before she receives a response. For some development-related issues, especially on a live site, this delay can dramatically impact the client’s bottom line. When an account manager is involved in the project, she can immediately alert the developer to the request and let the client know her concern is being addressed right away.

  4) Account managers are organization wizards

For all projects, but especially for complex projects, there can be a lot of documentation. Luckily, account managers choose this field because we love organizing chaos. This skill helps our team work faster throughout the course of a project. Although a client rarely sees our organization and management of tasks and documentation, they will see the benefits of more accurate work and increased efficiency across teams.

For Instance: A developer knows that she needs to reference a particular piece of documentation for the element of the site she is building today, but she can’t find it. She spends 15 minutes digging through folders to find what she needs, which seems to happen every time she completes a task. When an account manager is involved in a project, she knows what documentation the developer will need, so she has already attached it to the current task, saving the developer time.

 

  5) Account managers are flexible and adapt their skills to maximize their value

Every other role on a project is clear. A UX strategist helps to define which features will best achieve the business goals and how to maximize a user’s experience of interacting with them. A designer crafts how they will appear. A developer builds them. An account manager’s role in web development project management is less clear. When people ask me what I do on a typical day, my answer often comes after a long pause, and it’s rarely the same. Many others in my field find it difficult to describe their role succinctly, as our work can vary dramatically from day to day and from project to project.

For instance: Some days, my role is quite technical, and I am preparing or reviewing project documentation or checking the quality of completed development tasks. Other days, my role is more interpersonal, and I am supporting my team in delivering their best work or sitting in back-to-back meetings with my clients. With each project comes a new business to learn, often along with new technologies and additional nuance to my role. To be successful, I am always switching between the various priorities outlined here, along with many more.

 

At Elevated Third, we value our clients’ investment in our work and are always evolving to maximize the value of that investment. We build communication time into our projects because we know how invaluable strong account managers are to ensuring our Drupal websites generate the outcomes our clients value most.

Many of the projects I’ve done over the years as a Drupal expert have been marketing websites. Drupal is widely understood as a content management system that’s used to power sites like ours, but this is actually only the tip of the iceberg of what Drupal can do for an organization. Our team has used Drupal to build a variety of complex custom web applications that help companies work more efficiently.

Do you need an intranet?

We’ve used Drupal to build intranets that securely keep internal content and documents for staff eyes only. Drupal has an abundance of community features that make it easy to have wikis, commenting, user profiles, and messaging. Many organizations we’ve worked with integrate their intranet with LDAP or another single sign-on system.

Radial's intranet allows team members to quickly locate information about co-workers

We’ve also used Drupal for our own intranet for the past eight years. Our intranet helps keep our internal knowledge base easy to access and organizes information like our servers, sites, clients, and projects.
