Starting Affordable Professional Web Design Services

So I’ve finally gotten around (now that I’m free from college and all) to starting my own Fiverr gig (View Here).

The point being that I’ve gotten pretty good at designing websites with modern technology, and I believe having a personalized website shouldn’t be difficult for anyone. Furthermore, I think this is a good opportunity to test my customer service (lol) and client-handling skills.

Let’s see how this rolls. ¯\_(ツ)_/¯


Skraypar: Pattern parsing with Iterators and Look Aheads

You’ll often be told not to parse HTML with RegEx – but what if you’re a rebel?

WHY YOU SHOULDN’T PARSE HTML WITH REGEX

Clicky.

WHY YOU COULD PARSE WITH REGEX

Parsing static templates with RegEx is pretty simple. The basic course of action is to match a line against the pattern you want, then either add grouping selectors to the RegEx or get your hands dirty and polish the data out of that abhorrent line of HTML.
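As a rough illustration (in Python rather than PHP, and with a made-up HTML line), both approaches look something like this:

```python
import re

# A hypothetical line from a static HTML template (illustrative only)
line = '<span class="dark_text">Episodes:</span> 24</div>'

# Approach 1: capture the value directly with a grouping selector
match = re.search(r'Episodes:</span>\s*(\d+)', line)
if match:
    episodes = int(match.group(1))

# Approach 2: match the line, then "polish" the raw string by hand
raw = re.sub(r'<[^>]+>', '', line)          # strip tags -> "Episodes: 24"
value = raw.replace('Episodes:', '').strip()  # -> "24"
```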


I made a successful RESTful service, Jikan.moe, using nothing but RegEx. It didn’t require any extra dependencies, libraries, yadda yadda. Nor was speed a concern, since parsing was pretty quick.

What am I going on about?

Enter;Skraypar

Terrible choice of name aside, Skraypar grew out of simplifying the repetitive tasks I kept writing while parsing HTML with RegEx: pattern matching, loops, and so on.

Skraypar is an abstract PHP class that parses via pattern matching, Iterators, and Look Aheads.

The parsing tasks split into two:

  • (Inception) Pattern matching & a callback on the line of the match – Iterators
  • Additional pattern matching and callbacks within Iterators for dynamically located HTML – Look Aheads

Think of it as one Iterator matching a table, another Iterator matching the rows, and the Look Aheads parsing the cells.

This is a pretty abstract and experimental project; I won’t blame you if you think I’ve gone mad. But heck – finding new ways to do things is something I like to do.

How does it work?

1 – File Loading

Skraypar uses Guzzle as a dependency to fetch the HTML, or if it’s a local file, it simply loads it. The file is loaded into an array, with each line at its own index.

1B – Accepting & Rejecting

Fetching from the interwebs means you get to tell Skraypar which HTTP responses to allow and which ones to throw an exception at. By default, 200 (OK) and 303 (See Other) are the accepted HTTP responses.
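Skraypar itself does this in PHP via Guzzle, but the accept/reject idea boils down to something like this Python sketch (function and error names are made up for illustration):

```python
# Accepted-by-default HTTP statuses, mirroring Skraypar's defaults
DEFAULT_ACCEPTED = {200, 303}

def check_response(status, accepted=DEFAULT_ACCEPTED):
    """Raise if the HTTP status isn't on the accepted list."""
    if status not in accepted:
        raise RuntimeError(f"Unexpected HTTP response: {status}")
    return status

check_response(200)    # fine
check_response(303)    # fine
# check_response(404)  # would raise RuntimeError
```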

2 – Rules

When you extend the Skraypar class, you have to define a method, namely loadRules, which adds the rules Skraypar remembers while parsing.


Rules are patterns paired with callback functions for when the pattern matches. Each rule is run against every line, and once there’s a match and its callback executes, that particular rule is disabled.
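The rule mechanism can be sketched roughly like this (Python stand-in; the class and function names are illustrative, not Skraypar’s actual API):

```python
import re

class Rule:
    """A pattern plus a callback; disabled after it fires once."""
    def __init__(self, pattern, callback):
        self.pattern = re.compile(pattern)
        self.callback = callback
        self.active = True

def run_rules(lines, rules):
    results = {}
    for line in lines:
        for rule in rules:
            if rule.active:
                m = rule.pattern.search(line)
                if m:
                    rule.callback(m, results)
                    rule.active = False  # one match per rule, then disabled
    return results

rules = [Rule(r'<title>(.*?)</title>',
              lambda m, out: out.update(title=m.group(1)))]
data = run_rules(['<html>', '<title>Jikan</title>', '</html>'], rules)
```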


3 – Iterators

Iterators are used inside Rule callbacks. By setting a breakpoint pattern and a callback pattern, the Iterator loops over each line, executing a pattern match or Look Aheads, until the breakpoint pattern is reached.

If the breakpoint pattern is never found, Skraypar throws an exception that the parser failed, since the incrementing index eventually points to an unset offset in the array of lines from the file.

There can be Iterators within Iterators.
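A bare-bones sketch of the Iterator idea (again Python, with invented names): walk line by line from a starting index, match a pattern on each line, and stop at the breakpoint pattern. Running off the end of the lines array mirrors Skraypar’s “unset offset” failure.

```python
import re

def iterate(lines, start, breakpoint_pat, match_pat, callback):
    i = start
    results = []
    while True:
        if i >= len(lines):
            # Breakpoint never matched: the parser has failed
            raise IndexError("Parser failed: breakpoint pattern not found")
        if re.search(breakpoint_pat, lines[i]):
            return results, i
        m = re.search(match_pat, lines[i])
        if m:
            results.append(callback(m))
        i += 1

lines = ['<table>', '<tr>one</tr>', '<tr>two</tr>', '</table>']
rows, end = iterate(lines, 1, r'</table>', r'<tr>(.*?)</tr>',
                    lambda m: m.group(1))
```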

4 – Look Aheads

Look Aheads are used inside Iterators. Usually, you could access data on the next line after a pattern match simply by incrementing the iterator count by 1. But in some cases the data isn’t on the next line; it might be at an offset of 2 or more lines. Since the location of the data being parsed is dynamic, the Look Ahead method searches for the pattern of that dynamically located data and parses it with a function callback.
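In sketch form (Python stand-in, illustrative names, and a scan limit I’ve assumed for safety), a Look Ahead scans forward from the current index until the data’s pattern matches:

```python
import re

def look_ahead(lines, start, pattern, callback, limit=10):
    """Scan up to `limit` lines past `start` for `pattern`."""
    for offset in range(1, limit + 1):
        i = start + offset
        if i >= len(lines):
            break
        m = re.search(pattern, lines[i])
        if m:
            return callback(m), offset
    return None, None

# The value sits 1 or 2 lines after the label depending on the page
lines = ['<span>Score:</span>', '<br>', '<b>8.21</b>']
score, offset = look_ahead(lines, 0, r'<b>([\d.]+)</b>',
                           lambda m: float(m.group(1)))
```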

5 – References

Everything is passed, controlled, and set by reference within the Iterator callables. You can pass a reference to the Iterator itself into its own callable to set responses, use the Iterator class’s Look Ahead method, or manually set the iterator count property to an offset.


That’s pretty much it. This project is in development and is to be used as a dependency for the next major Jikan release. It’s not limited to Jikan; it can be used on any website or file.

No documentation is available at the moment.



Jikan News & Updates – Mid-2018

Okay, this news is almost a month old. Here goes.


We’re already 5 months into 2018 and I already have exciting news regarding Jikan. I wrote a post back in January laying out Jikan’s road map for the year, announcing 4 more features to be done in 2018. I’ve completed 3 of them, with User-related scraping to be done by the release of REST 2.3.

Over the past year, Jikan has gained huge traction, both client- and development-wise. Here are the highlights of the past 6 months.

Jikan REST 2.2

With the release of REST 2.2 came many new features.

  1. More extended data for Anime and Manga (with the exception of reviews & recommendations – for now)
  2. Anime/Manga/People/Characters Search! This comes with advanced search filters and pagination support.
  3. Top Anime and Manga with advanced filters
  4. Season – To list the Anime airing this season and for other years/seasons.
  5. Schedule – Anime scheduling for the week for this season
  6. Meta – Experimental requests for getting usage stats for Jikan and most requested links by daily, weekly & monthly periods.


And some service changes.

  1. Jikan has moved domains to Jikan.moe. The previous domain (Jikan.me) has been discontinued.
  2. Jikan REST API is now being hosted in Tokyo (closer to MyAnimeList’s Tokyo server) by an awesome dude called Hibiki.


100% Jikan Open Source

That’s right. The entirety of Jikan has been open-sourced under MIT License. This includes the website, docs and REST API service.

This not only adds flexibility, but also makes the code easier to manage and deploy. Gone are the days of patches having to wait until the next REST version. Now the RESTful service is updated as soon as a new JikanPHP version is out – this, of course, will vary for major feature releases, since I have to set up the controllers on the REST service.

Usage Stats

This is the Meta feature I mentioned.

It works by logging requests in Redis and increasing the respective counters for each request. Here are some interesting usage links.
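The counter idea is simple enough to sketch; here’s a Python stand-in using an in-memory dict in place of Redis (with a real client this would be an INCR per key; the key names and period buckets are my own invention for illustration):

```python
from collections import defaultdict

# In-memory stand-in for Redis counters
counters = defaultdict(int)

def log_request(endpoint, period_keys=('daily', 'weekly', 'monthly')):
    """Bump a total counter plus one counter per period bucket."""
    counters['requests:total'] += 1
    for period in period_keys:
        counters[f'requests:{period}:{endpoint}'] += 1

log_request('/anime/1')
log_request('/anime/1')
log_request('/manga/2')
```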

You can read more about further usage.

Late 2018 Roadmap (REST 2.3)

So here are a few things that will definitely be completed before the end of 2018 – perhaps in the upcoming months.

  • Top Characters/People
  • Anime/Manga Extended Data – Reviews & Recommendations
  • User Data – Profile, Watch History, Friends


Early 2019

That’s assuming MyAnimeList’s new API hasn’t been publicly released yet and people haven’t started ditching Jikan.

  • JikanPHP (Core) – Rewrite. This will introduce JikanPHP 2.X.
    • Separation of the parser as an abstraction class for Requests & RegEx parsing
    • Faster Parsing – Rework Extended Requests.
  • Jikan REST 3.0 – Given the crazy amount of requests we’ve been getting. The main problem is rate limiting from MyAnimeList, since we’re making all these requests from one server, i.e. one IP address.
    • Rework Redis database data caching
    • API keys. Note: This won’t replace free, unmonitored GET requests. The current limit of 5,000 will be lowered to encourage app/project developers to get an API key that supports higher rate limits.
    • Rework Extended Requests as separate API calls. This is a bottleneck right now, as an extended request makes 2 requests instead of one to merge the data for you into 1 response.
  • Relational data – Expand to other sites (maybe)

Playing with Browser Extensions

So I had to test out the usability of my REST API, and what better way than developing an app that does it? Of course, being limited in app development skills, I turned to something easier that I at least have some skills in: browser addons/extensions.

Now, one thing I learned is that developing a browser addon is mostly the same as making a web application – except you need a manifest file and have the browser compile it into a proper extension. That’s pretty much it.
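For reference, a minimal manifest looked something like this at the time (Manifest V2, which browsers used in 2018; the name, popup file, and permission URL here are my own guesses, not the addon’s actual manifest):

```json
{
  "manifest_version": 2,
  "name": "Anime Info",
  "version": "1.0",
  "description": "Search anime via the Jikan REST API",
  "browser_action": {
    "default_popup": "popup.html"
  },
  "permissions": ["https://api.jikan.moe/*"]
}
```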

Enter;Anime Info

The reason it has such a generic name is that I had thought about the possibility of releasing it into the wild market, free to download and use, so the name would be easily searchable and would get SEO points. But the thing is, it’s been about a month and Opera’s market takes almost forever to review an addon, while Chrome and Firefox have you make an initial payment to start putting addons/apps on their market – and I don’t have a card available at the moment, so ¯\_(ツ)_/¯.


I don’t plan on updating the addon; it was merely for searching up Anime and viewing it, just so I could get the gist of browser addon development and see how my REST API worked with it. The conclusion was pretty nice.

The whole project is open-source and available on Github: https://github.com/irfan-dahir/anime-info

Designed with Material Design in mind and developed with speed in mind, it’s probably the only addon out there for its purpose. (No kidding, I couldn’t find anything remotely close.) Sure, it could be updated to let users log in, update their Anime and Manga lists, and even view and search Manga – but again, nah.

Jikan API – Vision 2018 🎆 [Unofficial MyAnimeList API]

So it’s 2018 and Jikan is now 1 year old! MyAnimeList announced in late 2017 that they’ll be working on fixing up their API, but until then I’ll have Jikan running around. I have some plans for Jikan that need to be done, hopefully by mid-2018 or earlier, depending on college.


There are some things I’m still interested in scraping off of MAL; here’s the list.

User Profile

Taking my own profile as an example:

There’s a lot of data available per user profile. The best part here would be their favorite characters, people, anime, and manga, plus basic stats. The hardest part to extract would be the user’s “About Me” section, which is highly customizable – so I’ll have to think twice about parsing that, since MAL’s HTML source is already terrible enough.

Top Anime/Manga/People/Characters

These pages give you access to a paginated list of anime/manga/people/characters ranked by the community’s popularity/favorites, from #1 to the last available ranking. ’Tis a gold mine of an entry.

Anime/Manga/Person/Character Search!

The official MAL API already has this feature, but it only returns the first page of results! It also allows only simple string queries and requires user authentication for the API call to work – which is what Jikan is meant to overcome. This has been a requested feature, so I’ll most likely be working on a parser for it in the months to come.

Extended Data for Anime/Manga

This has been part of Jikan’s prospects since the beginning, but I had held off on any extended parsing other than characters/staff and episodes until recently, when I began making scrapers for Pictures, Videos & News related to the item. This trend will continue, as there are more pages with interesting data regarding an anime or manga – especially the reviews page, since it has the best data for sentiment analysis and averaging of any show or manga.

I’ll be focusing on these 4 this year! It takes time to mine pure data, since scraping HTML off MAL means a lot of weird and roundabout ways of doing things!

DAY 5 – ‘Comments’ | DECEMBER WEB DESIGN CHALLENGE

I actually lost the challenge – I totally forgot to make a design on the 28th. This should’ve been Day 6, but ah well. What I’ll simply do is push it a day extra, so the challenge will end on the 3rd of Jan instead. I will continue the challenge.

So, this is a post/comment based component. Something you’d see on any social media. It does look a little similar to Facebook but that’s just your imagination. 😋

Demo | Download | In The Making Timelapse (Coming)

I can confidently say that with this one I’ve managed to combine the use of FlexBox and Grid and go up a level! 🧙‍♂️ It’s amazing how they both can work together.

Specification

DAY 4 – ‘Plans’ | DECEMBER WEB DESIGN CHALLENGE

I struck off the 24-hour limit as I’ve been insanely busy during the past 2 days. But I managed to complete this design before morning (6.30am), so I won’t be considering it a “next day” yet, for the sake of not ruining my own challenge. 🤔

Demo | Download | In The Making Timelapse


This design, again, was done as fast as I could with little planning: I woke up at around 4.00am and began at about 4.30, so it took me roughly an hour and a half to complete this one.

I used FontAwesome 5 for the icons and picked out a gradient color from uigradients.com.


Specification

Colors used are:

  • Primary Gradient (#ff9966)
  • Secondary Gradient (#ff5e62)
  • White n’ Black

And as with the last 2 designs, Open Sans was used for the typeface.