X-Git-Url: https://cgit.sxemacs.org/?a=blobdiff_plain;f=todo;h=73c54bc1d7d158264765ef5a3a4a7d10dfa2bd38;hb=6448d6cb1d4ba0bc592b81b72a94215ce0aea338;hp=b12c524260002502c3624e67cb583020700e2f11;hpb=a0bc5441b8ef5292565c8022d78660074f24fdf1;p=gnus

diff --git a/todo b/todo
index b12c52426..73c54bc1d 100644
--- a/todo
+++ b/todo
@@ -94,10 +94,6 @@ Thanks for Micha Wiedenmann for this suggestion.
 
-* Parsing of the common list confirmation requests so that Gnus can
-  prepare the response with a single command.  Including LISTSERV
-  periodic ping messages and the like.
-
 * Parsing of the subscription notice to stash away details like what
   address you're subscribed to the list under (and automatically send
   mail to the list using that address, when you send mail inside the list
@@ -115,13 +111,13 @@
   X-Loop, and all of the other random headers that often work would be
   very cool.
 
-* Support for zipped folders for all backends this makes sense for.
-  Most likely using jka-compr.  (It has been suggested that this do
-  work but I think it should be verified for all backends.)
+* Agent:
 
-* Agent (Can someone write some subtopics here?  I don't use it myself
-  so I don't know what is lacking.)
+* A better interface to the agent download scoring rules, like the one
+  for the other scoring rules.
 
+* Editing of messages in the agents cache.
+
 * Support for encrypted folders.  Even if the mail arrives unencrypted
   Gnus should be able to encrypt the *folder* for added safety.  This
   should go for both Gnus' own folders and the folders Gnus reads from
@@ -136,10 +132,6 @@
   locations (e.g. work and home) and want to have the same
   configuration.
 
-* gnus-uu-decode should complain if one or more parts of a series post
-  (ie, "part N of X") is missing, and optionally tick what parts are
-  there for decoding in a later session.
-
 * Additional article marking, and an ability to affect marks placed
   during e.g. mail acquisition.
 
   I want to be able to notice the subject "fast money" or "web
   traffic", automatically mark it with a
@@ -184,12 +176,6 @@
   [Probably `assistant.el' will provide this.  But it's development is
   stalled.]
 
-* Full integration of nnir into Gnus.  Generic hooks for adding new
-  external nnir sources.  I use a couple experimental, in-house tools
-  (JPRC is a research lab, occupied with document analysis and machine
-  learning) and adding new search engines to nnir by hacking the main
-  nnir.el module is rather clunky.
-
 * Manual ordering of articles in an nnml folder.
 
   That is, keystrokes to move articles (or whole threads) up or down
@@ -218,11 +204,6 @@
   - meanwhile, we should still be able to associate certain mail
     sources with certain backends.
 
-* A better interface to the agent download scoring rules, like the one
-  for the other scoring rules.
-
-* Editing of messages in the agents cache.
-
 * More article marks (like '!' or '?').
   Maybe user defined marks that can be displayed as any choosen
   charakter, so one could do things like limiting on, to do whatever
   one likes with
@@ -1190,9 +1171,6 @@ gnus-killed-list:
 * gnus-(group,summary)-highlight should respect any `face' text props
   set on the lines.
 
-* use run-with-idle-timer for gnus-demon instead of the home-brewed
-  stuff for better reliability.
-
 * nndraft-request-group should tally auto-save files.
 
 * implement nntp-retry-on-break and nntp-command-timeout.
@@ -1202,7 +1180,10 @@ gnus-killed-list:
 * nn*-spool-methods
 
-* interrupitng agent fetching of articles should save articles.
+* Interrupting the agent fetching of articles should save articles.
+  Have the Agent write out articles, one by one, as it retrieves
+  them, to avoid having to re-fetch them all if Emacs should crash
+  while fetching.
 
 * command to open a digest group, and copy all the articles there
   to the current group.
@@ -1318,10 +1299,6 @@ gnus-killed-list:
 * Remove list identifiers from the subject in the summary when doing
   `^' and the like.
 
-* Have the Agent write out articles, one by one, as it retrieves
-  them, to avoid having to re-fetch them all if Emacs should crash
-  while fetching.
-
 * nnweb should include the "get whole article" article when getting
   articles.