Help:Bots

From FamilySearch Wiki


Important note: This page started out as a copy of one of the Public Domain Help Pages, which can be freely copied into fresh wiki installations and/or distributed with MediaWiki software; see mw:Help:Contents for an overview of all pages. See Project:PD help/Copying for instructions.


Bots, or robots, are software applications that perform automated tasks on the internet. Most often, they run tasks that are simple and structurally repetitive, much more quickly than a human could alone.


Questions?
Visit Get Help to receive help with contributing to the Wiki.




Policy

Running a personal bot requires prior consent from FamilySearch management.

Running your own bot

The Python wikipediabot is a collection of tools that automate work on MediaWiki sites.
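For illustration only, here is a minimal sketch of what a script built on this framework can look like, using the Pywikibot library (the current successor to the Python wikipediabot). The page title, text, and edit summary are hypothetical, and any real bot run on this wiki still needs the consent described under Policy.

# Minimal, hypothetical Pywikibot sketch: add a category to one page.
# Assumes Pywikibot is installed and user-config.py points at the target wiki.
import pywikibot

site = pywikibot.Site()                    # wiki and account come from user-config.py
page = pywikibot.Page(site, "Sandbox")     # hypothetical page title

text = page.text
if "[[Category:Help]]" not in text:        # edit only when a change is actually needed
    page.text = text + "\n[[Category:Help]]"
    page.save(summary="Bot: adding Help category (test edit)")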


Frameworks and interfaces for bot development

See API Client Code

MediaWiki API is for convenient access to machine-readable data.
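As an illustration (not from the original page), the following sketch queries the MediaWiki API directly with the Python requests library; the endpoint URL and page title are assumptions and should be replaced with the target wiki's actual api.php address and page.

# Minimal, hypothetical sketch of a raw MediaWiki API query with requests.
import requests

API_URL = "https://www.mediawiki.org/w/api.php"   # assumed endpoint; use the target wiki's api.php

params = {
    "action": "query",        # read data
    "titles": "Help:Bots",    # hypothetical page title
    "prop": "revisions",
    "rvprop": "content",
    "format": "json",
}

response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()
data = response.json()

# The reply is machine-readable JSON keyed by page ID.
for page_id, page in data["query"]["pages"].items():
    print(page_id, page.get("title"))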


Tip for wiki admins:
  • Manual:User rights management for setting bot access.
  • Robots.txt tells external web robots how to index your site (see the sketch below).
  • MassEditRegex allows regular expression edits through a special page.
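As a hedged illustration of the robots.txt item above, Python's standard urllib.robotparser module can be used from the bot side to check what a site's robots.txt allows; the URL and user-agent name here are hypothetical.

# Hypothetical check of a site's robots.txt rules using only the standard library.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.org/robots.txt")   # assumed robots.txt location
rp.read()

# True if a crawler identifying itself as "ExampleBot" may fetch the page.
print(rp.can_fetch("ExampleBot", "https://www.example.org/wiki/Help:Bots"))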