From: Mohammad Akhlaghi
Subject: [gnuastro-commits] master ee4bf1e 069/113: Imported recent work from master, Segment's 3D kernels added
Date: Fri, 16 Apr 2021 10:33:49 -0400 (EDT)

branch: master
commit ee4bf1eea739613d878ff27b2d32e8ae67545527
Merge: 91f2d3e eaf24d3
Author: Mohammad Akhlaghi <mohammad@akhlaghi.org>
Commit: Mohammad Akhlaghi <mohammad@akhlaghi.org>

    Imported recent work from master, Segment's 3D kernels added
    
    There were no conflicts in the merge, but to make things compatible, we
    also need to fix a recent issue: Segment doesn't have the kernel headers in
    its `Makefile.am'.
---
 NEWS                         |   4 +-
 bin/noisechisel/Makefile.am  |   2 +-
 bin/segment/Makefile.am      |   3 +-
 bootstrap                    | 232 ++++++++++++++++++++++---------------------
 configure.ac                 |  11 +-
 doc/announce-acknowledge.txt |   1 +
 doc/gnuastro.texi            | 212 ++++++++++++++++++++++-----------------
 7 files changed, 251 insertions(+), 214 deletions(-)

diff --git a/NEWS b/NEWS
index 6d0eb6e..fafe7df 100644
--- a/NEWS
+++ b/NEWS
@@ -34,7 +34,7 @@ GNU Astronomy Utilities NEWS                          -*- outline -*-
     CPU usage. As a result, the input argument is no longer assumed to be
     the values file, but the object labels file. Please see the
     "MakeCatalog inputs and basic settings" section of the book for
-    more. Here is the summary of the the options:
+    more. Here is the summary of the new options:
     --insky: Sky value as a single value or file (dataset).
     --instd: Sky standard deviation as a single value or file (dataset).
     --valuesfile: filename containing the values dataset.
@@ -72,7 +72,7 @@ GNU Astronomy Utilities NEWS                          -*- outline -*-
     --ignoreblankinsky: similar to same option in NoiseChisel.
 
   Libraries:
-    gal_array_read: read array from any of known formats (FITS,TIFF,JPEG,...).
+    gal_array_read: read array from any of known formats (FITS, TIFF, JPEG,...).
     gal_array_read_to_type: similar to `gal_array_read', but to given type.
     gal_array_read_one_ch: Read a dataset, abort if it has multiple channels.
     gal_array_read_one_ch_to_type: Make sure input is in one channel and type.
diff --git a/bin/noisechisel/Makefile.am b/bin/noisechisel/Makefile.am
index 2428806..f16d69b 100644
--- a/bin/noisechisel/Makefile.am
+++ b/bin/noisechisel/Makefile.am
@@ -34,7 +34,7 @@ astnoisechisel_SOURCES = main.c ui.c detection.c noisechisel.c sky.c     \
   threshold.c
 
 EXTRA_DIST = main.h authors-cite.h args.h ui.h detection.h noisechisel.h \
-  sky.h threshold.h kernel-2d.h
+  sky.h threshold.h kernel-2d.h kernel-3d.h
 
 
 
diff --git a/bin/segment/Makefile.am b/bin/segment/Makefile.am
index 0ce6520..479192d 100644
--- a/bin/segment/Makefile.am
+++ b/bin/segment/Makefile.am
@@ -32,7 +32,8 @@ astsegment_LDADD = -lgnuastro
 
 astsegment_SOURCES = main.c ui.c segment.c clumps.c
 
-EXTRA_DIST = main.h authors-cite.h args.h ui.h segment.h clumps.h
+EXTRA_DIST = main.h authors-cite.h args.h ui.h segment.h clumps.h   \
+            kernel-2d.h kernel-3d.h
 
 
 
diff --git a/bootstrap b/bootstrap
index 5d3c289..eddacfb 100755
--- a/bootstrap
+++ b/bootstrap
@@ -1,10 +1,10 @@
 #! /bin/sh
 # Print a version string.
-scriptversion=2016-11-03.18; # UTC
+scriptversion=2018-04-28.14; # UTC
 
 # Bootstrap this package from checked-out sources.
 
-# Copyright (C) 2003-2016 Free Software Foundation, Inc.
+# Copyright (C) 2003-2018 Free Software Foundation, Inc.
 
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -17,7 +17,7 @@ scriptversion=2016-11-03.18; # UTC
 # GNU General Public License for more details.
 
 # You should have received a copy of the GNU General Public License
-# along with this program.  If not, see <http://www.gnu.org/licenses/>.
+# along with this program.  If not, see <https://www.gnu.org/licenses/>.
 
 # Originally written by Paul Eggert.  The canonical version of this
 # script is maintained as build-aux/bootstrap in gnulib, however, to
@@ -109,9 +109,6 @@ die() { warn_ "$@"; exit 1; }
 
 # Configuration.
 
-# Name of the Makefile.am
-gnulib_mk=gnulib.mk
-
 # List of gnulib modules needed.
 gnulib_modules=
 
@@ -141,7 +138,7 @@ po_download_command_format=\
 # Fallback for downloading .po files (if rsync fails).
 po_download_command_format2=\
 "wget --mirror -nd -q -np -A.po -P '%s' \
- http://translationproject.org/latest/%s/";
+ https://translationproject.org/latest/%s/";
 
 # Prefer a non-empty tarname (4th argument of AC_INIT if given), else
 # fall back to the package name (1st argument with munging)
@@ -170,7 +167,15 @@ source_base=lib
 m4_base=m4
 doc_base=doc
 tests_base=tests
-gnulib_extra_files=''
+gnulib_extra_files="
+        build-aux/install-sh
+        build-aux/mdate-sh
+        build-aux/texinfo.tex
+        build-aux/depcomp
+        build-aux/config.guess
+        build-aux/config.sub
+        doc/INSTALL
+"
 
 # Additional gnulib-tool options to use.  Use "\newline" to break lines.
 gnulib_tool_option_extras=
@@ -264,24 +269,18 @@ case "$0" in
   *) test -r "$0.conf" && . ./"$0.conf" ;;
 esac
 
-# Extra files from gnulib, which override files from other sources.
-test -z "${gnulib_extra_files}" && \
-  gnulib_extra_files="
-        build-aux/install-sh
-        build-aux/mdate-sh
-        build-aux/texinfo.tex
-        build-aux/depcomp
-        build-aux/config.guess
-        build-aux/config.sub
-        doc/INSTALL
-"
-
 if test "$vc_ignore" = auto; then
   vc_ignore=
   test -d .git && vc_ignore=.gitignore
   test -d CVS && vc_ignore="$vc_ignore .cvsignore"
 fi
 
+if test x"$gnulib_modules$gnulib_files$gnulib_extra_files" = x; then
+  use_gnulib=false
+else
+  use_gnulib=true
+fi
+
 # Translate configuration into internal form.
 
 # Parse options.
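The `use_gnulib` switch added in the hunk above hinges on a classic portable-shell idiom. Here is a minimal standalone sketch of its behavior; the variable values are made up (in the real script they come from the configuration defaults and any `bootstrap.conf` overrides):

```shell
# Minimal sketch of the `use_gnulib' test added above.  Prefixing both
# sides with `x' keeps old `test' implementations from mis-parsing
# values that begin with characters like `-'.  All three variables
# empty means gnulib is not needed at all.
gnulib_modules=
gnulib_files=
gnulib_extra_files=
if test x"$gnulib_modules$gnulib_files$gnulib_extra_files" = x; then
  use_gnulib=false
else
  use_gnulib=true
fi
echo "$use_gnulib"
```

Since this commit also sets the `gnulib_extra_files` default unconditionally (rather than as a fallback), a stock bootstrap run will normally take the `true` branch unless the package's `bootstrap.conf` empties all three variables.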
@@ -612,85 +611,87 @@ git_modules_config () {
   test -f .gitmodules && git config --file .gitmodules "$@"
 }
 
-if $use_git; then
-  gnulib_path=$(git_modules_config submodule.gnulib.path)
-  test -z "$gnulib_path" && gnulib_path=gnulib
-fi
+if $use_gnulib; then
+  if $use_git; then
+    gnulib_path=$(git_modules_config submodule.gnulib.path)
+    test -z "$gnulib_path" && gnulib_path=gnulib
+  fi
 
-# Get gnulib files.  Populate $GNULIB_SRCDIR, possibly updating a
-# submodule, for use in the rest of the script.
+  # Get gnulib files.  Populate $GNULIB_SRCDIR, possibly updating a
+  # submodule, for use in the rest of the script.
 
-case ${GNULIB_SRCDIR--} in
--)
-  # Note that $use_git is necessarily true in this case.
-  if git_modules_config submodule.gnulib.url >/dev/null; then
-    echo "$0: getting gnulib files..."
-    git submodule init -- "$gnulib_path" || exit $?
-    git submodule update -- "$gnulib_path" || exit $?
+  case ${GNULIB_SRCDIR--} in
+  -)
+    # Note that $use_git is necessarily true in this case.
+    if git_modules_config submodule.gnulib.url >/dev/null; then
+      echo "$0: getting gnulib files..."
+      git submodule init -- "$gnulib_path" || exit $?
+      git submodule update -- "$gnulib_path" || exit $?
 
-  elif [ ! -d "$gnulib_path" ]; then
-    echo "$0: getting gnulib files..."
+    elif [ ! -d "$gnulib_path" ]; then
+      echo "$0: getting gnulib files..."
 
-    trap cleanup_gnulib 1 2 13 15
+      trap cleanup_gnulib 1 2 13 15
 
-    shallow=
-    git clone -h 2>&1 | grep -- --depth > /dev/null && shallow='--depth 2'
-    git clone $shallow git://git.sv.gnu.org/gnulib "$gnulib_path" ||
-      cleanup_gnulib
+      shallow=
+      git clone -h 2>&1 | grep -- --depth > /dev/null && shallow='--depth 2'
+      git clone $shallow git://git.sv.gnu.org/gnulib "$gnulib_path" ||
+        cleanup_gnulib
 
-    trap - 1 2 13 15
-  fi
-  GNULIB_SRCDIR=$gnulib_path
-  ;;
-*)
-  # Use GNULIB_SRCDIR directly or as a reference.
-  if $use_git && test -d "$GNULIB_SRCDIR"/.git && \
-        git_modules_config submodule.gnulib.url >/dev/null; then
-    echo "$0: getting gnulib files..."
-    if git submodule -h|grep -- --reference > /dev/null; then
-      # Prefer the one-liner available in git 1.6.4 or newer.
-      git submodule update --init --reference "$GNULIB_SRCDIR" \
-        "$gnulib_path" || exit $?
-    else
-      # This fallback allows at least git 1.5.5.
-      if test -f "$gnulib_path"/gnulib-tool; then
-        # Since file already exists, assume submodule init already complete.
-        git submodule update -- "$gnulib_path" || exit $?
+      trap - 1 2 13 15
+    fi
+    GNULIB_SRCDIR=$gnulib_path
+    ;;
+  *)
+    # Use GNULIB_SRCDIR directly or as a reference.
+    if $use_git && test -d "$GNULIB_SRCDIR"/.git && \
+          git_modules_config submodule.gnulib.url >/dev/null; then
+      echo "$0: getting gnulib files..."
+      if git submodule -h|grep -- --reference > /dev/null; then
+        # Prefer the one-liner available in git 1.6.4 or newer.
+        git submodule update --init --reference "$GNULIB_SRCDIR" \
+          "$gnulib_path" || exit $?
       else
-        # Older git can't clone into an empty directory.
-        rmdir "$gnulib_path" 2>/dev/null
-        git clone --reference "$GNULIB_SRCDIR" \
-          "$(git_modules_config submodule.gnulib.url)" "$gnulib_path" \
-          && git submodule init -- "$gnulib_path" \
-          && git submodule update -- "$gnulib_path" \
-          || exit $?
+        # This fallback allows at least git 1.5.5.
+        if test -f "$gnulib_path"/gnulib-tool; then
+          # Since file already exists, assume submodule init already complete.
+          git submodule update -- "$gnulib_path" || exit $?
+        else
+          # Older git can't clone into an empty directory.
+          rmdir "$gnulib_path" 2>/dev/null
+          git clone --reference "$GNULIB_SRCDIR" \
+            "$(git_modules_config submodule.gnulib.url)" "$gnulib_path" \
+            && git submodule init -- "$gnulib_path" \
+            && git submodule update -- "$gnulib_path" \
+            || exit $?
+        fi
       fi
+      GNULIB_SRCDIR=$gnulib_path
     fi
-    GNULIB_SRCDIR=$gnulib_path
-  fi
-  ;;
-esac
+    ;;
+  esac
 
-# $GNULIB_SRCDIR now points to the version of gnulib to use, and
-# we no longer need to use git or $gnulib_path below here.
+  # $GNULIB_SRCDIR now points to the version of gnulib to use, and
+  # we no longer need to use git or $gnulib_path below here.
+
+  if $bootstrap_sync; then
+    cmp -s "$0" "$GNULIB_SRCDIR/build-aux/bootstrap" || {
+      echo "$0: updating bootstrap and restarting..."
+      case $(sh -c 'echo "$1"' -- a) in
+        a) ignored=--;;
+        *) ignored=ignored;;
+      esac
+      exec sh -c \
+        'cp "$1" "$2" && shift && exec "${CONFIG_SHELL-/bin/sh}" "$@"' \
+        $ignored "$GNULIB_SRCDIR/build-aux/bootstrap" \
+        "$0" "$@" --no-bootstrap-sync
+    }
+  fi
 
-if $bootstrap_sync; then
-  cmp -s "$0" "$GNULIB_SRCDIR/build-aux/bootstrap" || {
-    echo "$0: updating bootstrap and restarting..."
-    case $(sh -c 'echo "$1"' -- a) in
-      a) ignored=--;;
-      *) ignored=ignored;;
-    esac
-    exec sh -c \
-      'cp "$1" "$2" && shift && exec "${CONFIG_SHELL-/bin/sh}" "$@"' \
-      $ignored "$GNULIB_SRCDIR/build-aux/bootstrap" \
-      "$0" "$@" --no-bootstrap-sync
-  }
+  gnulib_tool=$GNULIB_SRCDIR/gnulib-tool
+  <$gnulib_tool || exit $?
 fi
 
-gnulib_tool=$GNULIB_SRCDIR/gnulib-tool
-<$gnulib_tool || exit $?
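The self-update block moved inside the `use_gnulib` guard above contains a subtle portability probe before the `exec`. A standalone sketch of just that probe (nothing here is Gnuastro-specific):

```shell
# Sketch of the probe bootstrap runs before re-exec'ing itself above.
# POSIX says `sh -c CMD NAME ARG...' puts NAME in $0 and ARG in $1,
# but some historical shells swallowed the first extra argument as $0.
# Probe once, then pad the argument list accordingly.
case $(sh -c 'echo "$1"' -- a) in
  a) ignored=--      ;;  # POSIX behavior: `--' went to $0, `a' to $1
  *) ignored=ignored ;;  # old behavior: first extra arg consumed as $0
esac
echo "$ignored"
```

On any POSIX-conforming `sh` this prints `--`, which is then passed as the throwaway `$0` of the re-exec'ed copy of the script.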
-
 # Get translations.
 
 download_po_files() {
@@ -699,7 +700,7 @@ download_po_files() {
   echo "$me: getting translations into $subdir for $domain..."
   cmd=$(printf "$po_download_command_format" "$domain" "$subdir")
   eval "$cmd" && return
-  # Fallback to HTTP.
+  # Fallback to HTTPS.
   cmd=$(printf "$po_download_command_format2" "$subdir" "$domain")
   eval "$cmd"
 }
@@ -790,9 +791,9 @@ symlink_to_dir()
       # Leave any existing symlink alone, if it already points to the source,
       # so that broken build tools that care about symlink times
       # aren't confused into doing unnecessary builds.  Conversely, if the
-      # existing symlink's time stamp is older than the source, make it afresh,
+      # existing symlink's timestamp is older than the source, make it afresh,
       # so that broken tools aren't confused into skipping needed builds.  See
-      # <http://lists.gnu.org/archive/html/bug-gnulib/2011-05/msg00326.html>.
+      # <https://lists.gnu.org/r/bug-gnulib/2011-05/msg00326.html>.
       test -h "$dst" &&
       src_ls=$(ls -diL "$src" 2>/dev/null) && set $src_ls && src_i=$1 &&
       dst_ls=$(ls -diL "$dst" 2>/dev/null) && set $dst_ls && dst_i=$1 &&
@@ -898,32 +899,33 @@ fi
 
 # Import from gnulib.
 
-gnulib_tool_options="\
- --import\
- --no-changelog\
- --aux-dir $build_aux\
- --doc-base $doc_base\
- --lib $gnulib_name\
- --m4-base $m4_base/\
- --source-base $source_base/\
- --tests-base $tests_base\
- --local-dir $local_gl_dir\
- $gnulib_tool_option_extras\
-"
-if test $use_libtool = 1; then
-  case "$gnulib_tool_options " in
-    *' --libtool '*) ;;
-    *) gnulib_tool_options="$gnulib_tool_options --libtool" ;;
-  esac
-fi
-echo "$0: $gnulib_tool $gnulib_tool_options --import ..."
-$gnulib_tool $gnulib_tool_options --import $gnulib_modules \
-  || die "gnulib-tool failed"
+if $use_gnulib; then
+  gnulib_tool_options="\
+   --no-changelog\
+   --aux-dir $build_aux\
+   --doc-base $doc_base\
+   --lib $gnulib_name\
+   --m4-base $m4_base/\
+   --source-base $source_base/\
+   --tests-base $tests_base\
+   --local-dir $local_gl_dir\
+   $gnulib_tool_option_extras\
+  "
+  if test $use_libtool = 1; then
+    case "$gnulib_tool_options " in
+      *' --libtool '*) ;;
+      *) gnulib_tool_options="$gnulib_tool_options --libtool" ;;
+    esac
+  fi
+  echo "$0: $gnulib_tool $gnulib_tool_options --import ..."
+  $gnulib_tool $gnulib_tool_options --import $gnulib_modules \
+    || die "gnulib-tool failed"
 
-for file in $gnulib_files; do
-  symlink_to_dir "$GNULIB_SRCDIR" $file \
-    || die "failed to symlink $file"
-done
+  for file in $gnulib_files; do
+    symlink_to_dir "$GNULIB_SRCDIR" $file \
+      || die "failed to symlink $file"
+  done
+fi
 
 bootstrap_post_import_hook \
   || die "bootstrap_post_import_hook failed"
@@ -1020,7 +1022,7 @@ bootstrap_epilogue
 echo "$0: done.  Now you can run './configure'."
 
 # Local variables:
-# eval: (add-hook 'write-file-hooks 'time-stamp)
+# eval: (add-hook 'before-save-hook 'time-stamp)
 # time-stamp-start: "scriptversion="
 # time-stamp-format: "%:y-%02m-%02d.%02H"
 # time-stamp-time-zone: "UTC0"
diff --git a/configure.ac b/configure.ac
index 7b589c2..b9045e7 100644
--- a/configure.ac
+++ b/configure.ac
@@ -98,10 +98,17 @@ AC_SUBST(IN_CPPFLAGS, "$CPPFLAGS")
 
 
 # Generic compiler flags for all sub-directories.
+#
+# IMPORTANT note: we need to add the `./lib/' directory to `LDFLAGS' and
+# `CPPFLAGS', since all of Gnuastro needs to use the library headers and link
+# with them. But we need to use the absolute address of `./lib/'. So we
+# need to use the Autoconf variables with an `abs_' prefix. Otherwise (if
+# we use `top_builddir' instead of `abs_top_builddir'), we are going to have
+# problems in special conditions (most specifically: in `make distcheck').
 CFLAGS="-Wall -O3 $CFLAGS"
 CXXFLAGS="-Wall -O3 $CXXFLAGS"
-LDFLAGS="-L\$(top_builddir)/lib $LDFLAGS"
-CPPFLAGS="-I\$(top_srcdir)/lib $CPPFLAGS"
+LDFLAGS="-L\$(abs_top_builddir)/lib $LDFLAGS"
+CPPFLAGS="-I\$(abs_top_srcdir)/lib $CPPFLAGS"
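The comment in this hunk reduces to a general fact that is easy to see in isolation: a relative path changes meaning with the current working directory, while an absolute one does not. A tiny standalone illustration (the directory names are invented, not from Gnuastro's actual build tree):

```shell
# A relative path is resolved against whatever the current directory
# happens to be, so a flag like `-L./lib' means different things as
# make recurses into sub-directories; an absolute path (what the
# `abs_top_builddir' Autoconf variable provides) always names the
# same place.
top=$(mktemp -d)
mkdir -p "$top/lib" "$top/bin/segment"
cd "$top/bin/segment"
test -d "$top/lib" && echo "absolute: lib found"
test -d "./lib"    || echo "relative: lib not visible from here"
```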
 
 
 
diff --git a/doc/announce-acknowledge.txt b/doc/announce-acknowledge.txt
index 1fddafe..68ec251 100644
--- a/doc/announce-acknowledge.txt
+++ b/doc/announce-acknowledge.txt
@@ -12,6 +12,7 @@ Ole Streicher
 Michel Tallon
 Juan C. Tello
 Éric Thiébaut
+David Valls-Gabaud
 Aaron Watkins
 Sara Yousefi Taemeh
 Johannes Zabl
diff --git a/doc/gnuastro.texi b/doc/gnuastro.texi
index 39e4e9b..87ca66c 100644
--- a/doc/gnuastro.texi
+++ b/doc/gnuastro.texi
@@ -3928,16 +3928,17 @@ deeper/shallower.
 large object:} As you saw above, the reason we chose this particular
 configuration for NoiseChisel to detect the wings of the M51 group was
 strongly influenced by this particular object in this particular
-image. When signal takes over such a large fraction of your dataset, you
-will need some manual checking, intervention, or customization, to make
-sure that it is successfully detected. In other words, to make sure that
-your noise measurements are least affected by the signal@footnote{In the
-future, we may add capabilities to optionally automate some of the choices
-made here, please join us in doing this if you are interested. However,
-given the many problems in existing ``smart'' solutions, such automatic
-changing of the configuration may cause more problems than they solve. So
-even when they are implemented, we would strongly recommend manual checks
-and intervention for a robust analysis.}.
+image. When low surface brightness signal takes over such a large fraction
+of your dataset (and you want to accurately detect/account for it), to make
+sure that it is successfully detected, you will need some manual checking,
+intervention, or customization. In other words, to make sure that your
+noise measurements are least affected by the signal@footnote{In the future,
+we may add capabilities to optionally automate some of the choices made
+here, please join us in doing this if you are interested. However, given
+the many problems in existing ``smart'' solutions, such automatic changing
+of the configuration may cause more problems than they solve. So even when
+they are implemented, we would strongly recommend manual checks and
+intervention for a robust analysis.}.
 @end cartouche
 
 To avoid typing all these options every time you run NoiseChisel on this
@@ -4002,13 +4003,16 @@ rm $1"_cat.fits" $1.reg
 @end example
 
 @noindent
-Finally, you just have to activate its executable flag with the command
-below. This will enable you to directly call the script as a command.
+Finally, you just have to activate the script's executable flag with the
+command below. This will enable you to directly/easily call the script as a
+command.
 
 @example
 $ chmod +x check-clumps.sh
 @end example
 
+@cindex AWK
+@cindex GNU AWK
 This script doesn't expect the @file{.fits} suffix of the input's filename
 as the first argument. Because the script produces intermediate files (a
 catalog and DS9 region file, which are later deleted). However, we don't
@@ -4016,9 +4020,10 @@ want multiple instances of the script (on different files in the same
 directory) to collide (read/write to the same intermediate
 files). Therefore, we have used suffixes added to the input's name to
 identify the intermediate files. Note how all the @code{$1} instances in
-the commands (not within the AWK command where @code{$1} refers to the
-first column) are followed by a suffix. If you want to keep the
-intermediate files, put a @code{#} at the start of the last line.
+the commands (not within the AWK command@footnote{In AWK, @code{$1} refers
+to the first column, while in the shell script, it refers to the first
+argument.}) are followed by a suffix. If you want to keep the intermediate
+files, put a @code{#} at the start of the last line.
 
 The few, but high-valued, bright pixels in the central parts of the
 galaxies can hinder easy visual inspection of the fainter parts of the
@@ -4040,8 +4045,8 @@ Go ahead and run this command. You will see the intermediate processing
 being done and finally it opens SAO DS9 for you with the regions
 superimposed on all the extensions of Segment's output. The script will
 only finish (and give you control of the command-line) when you close
-DS9. If you need your access to the command-line before closing DS9, you
-can add a @code{&} after the end of the command above.
+DS9. If you need your access to the command-line before closing DS9, add a
+@code{&} after the end of the command above.
 
 @cindex Purity
 @cindex Completeness
@@ -4065,15 +4070,15 @@ best purity, you have to sacrifice completeness and vice versa.
 
 One interesting region to inspect in this image is the many bright peaks
 around the central parts of M51. Zoom into that region and inspect how many
-of them have actually been detected as true clumps, do you have a good
+of them have actually been detected as true clumps. Do you have a good
 balance between completeness and purity? Also look out far into the wings
 of the group and inspect the completeness and purity there.
 
 An easier way to inspect completeness (and only completeness) is to mask all
-the pixels detected as clumps and see what is left over. You can do this
-with a command like below. For easy reading of the command, we'll define
-the shell variable @code{i} for the image name and save the output in
-@file{masked.fits}.
+the pixels detected as clumps and visually inspect the rest of the
+pixels. You can do this using Arithmetic in a command like below. For easy
+reading of the command, we'll define the shell variable @code{i} for the
+image name and save the output in @file{masked.fits}.
 
 @example
 $ i=r_detected_segmented.fits
@@ -4083,17 +4088,17 @@ $ astarithmetic $i $i 0 gt nan where -hINPUT -hCLUMPS -omasked.fits
 Inspecting @file{masked.fits}, you can see some very diffuse peaks that
 have been missed, especially as you go farther away from the group center
 and into the diffuse wings. This is due to the fact that with this
-configuration we have focused more on the sharper clumps. To put the focus
-more on diffuse clumps, can use a wider convolution kernel. Using a larger
-kernel can also help in detecting larger clumps (thus better separating
-them from the underlying signal).
+configuration, we have focused more on the sharper clumps. To put the focus
+more on diffuse clumps, you can use a wider convolution kernel. Using a
+larger kernel can also help in detecting the existing clumps to fainter
+levels (thus better separating them from the surrounding diffuse signal).
 
 You can make any kernel easily using the @option{--kernel} option in
 @ref{MakeProfiles}. But note that a larger kernel is also going to wash-out
 many of the sharp/small clumps close to the center of M51 and also some
 smaller peaks on the wings. Please continue playing with Segment's
 configuration to obtain a more complete result (while keeping reasonable
-purity). We'll finish the discussion on finding true clumps here.
+purity). We'll finish the discussion on finding true clumps at this point.
 
 The properties of the background objects can then easily be measured using
 @ref{MakeCatalog}. To measure the properties of the background objects
@@ -4101,16 +4106,19 @@ The properties of the background objects can then easily be measured using
 diffuse region. When measuring clump properties with @ref{MakeCatalog}, the
 ambient flux (from the diffuse region) is calculated and subtracted. If the
 diffuse region is masked, its effect on the clump brightness cannot be
-calculated and subtracted. But to keep this tutorial short, we'll stop
-here. See @ref{General program usage tutorial} and @ref{Segment} for more
-on Segment, producing catalogs with MakeCatalog and using those catalogs.
+calculated and subtracted.
+
+To keep this tutorial short, we'll stop here. See @ref{General program
+usage tutorial} and @ref{Segment} for more on using Segment, producing
+catalogs with MakeCatalog and using those catalogs.
 
 Finally, if this book or any of the programs in Gnuastro have been useful
 for your research, please cite the respective papers and share your
 thoughts and suggestions with us (it can be very encouraging). All Gnuastro
 programs have a @option{--cite} option to help you cite the authors' work
 more easily. Just note that it may be necessary to cite additional papers
-for different programs, so please try it out for any program you use.
+for different programs, so please use @option{--cite} with any program that
+has been useful in your work.
 
 @example
 $ astmkcatalog --cite
@@ -15616,17 +15624,16 @@ configuration options.
 @node Segment changes after publication, Invoking astsegment, Segment, Segment
 @subsection Segment changes after publication
 
-Segment's main algorithm and working strategy was initially defined and
+Segment's main algorithm and working strategy were initially defined and
 introduced in Section 3.2 of @url{https://arxiv.org/abs/1505.01664,
-Akhlaghi and Ichikawa [2015]}. At that time it was part of
-@ref{NoiseChisel}, NoiseChisel's detection program@footnote{Until Gnuastro
-version 0.6 (May 2018), NoiseChisel was in charge of detection @emph{and}
-segmentation. For increased creativity and modularity, NoiseChisel's
-segmentation features were spun-off into separate program (Segment).}. It
-is strongly recommended to read this paper for a good understanding of what
-Segment does and how each parameter influences the output. To help in
-understanding how Segment works, that paper has a large number of figures
-showing every step on multiple mock and real examples.
+Akhlaghi and Ichikawa [2015]}. Prior to Gnuastro version 0.6 (released
+2018), one program (NoiseChisel) was in charge of detection @emph{and}
+segmentation. To increase creativity and modularity, NoiseChisel's
+segmentation features were spun-off into a separate program (Segment). It
+is strongly recommended to read that paper for a good understanding of what
+Segment does, how it relates to detection, and how each parameter
+influences the output. That paper has a large number of figures showing
+every step on multiple mock and real examples.
 
 However, the paper cannot be updated anymore, but Segment has evolved (and
 will continue to do so): better algorithms or steps have been (and will be)
@@ -15635,7 +15642,7 @@ of this section is to make the transition from the paper to your installed
 version, as smooth as possible through the list below. For a more detailed
 list of changes in previous Gnuastro releases/versions, please follow the
 @file{NEWS} file@footnote{The @file{NEWS} file is present in the released
-Gnuastro tarball, see @ref{Release tarball}}.
+Gnuastro tarball, see @ref{Release tarball}.}.
 
 @itemize
 
@@ -15646,7 +15653,7 @@ slightly less than NoiseChisel's default kernel (which has a FWHM of 2
 pixels). This enables the better detection of sharp clumps: as the kernel
 gets wider, the lower signal-to-noise (but sharp/small) clumps will be
 washed away into the noise. You can use MakeProfiles to build your own
-kernel if this is too sharp/wide for your purpose, see the
+kernel if this is too sharp/wide for your purpose. For more, see the
 @option{--kernel} option in @ref{Segment input}.
 
 The ability to use a different convolution kernel for detection and
@@ -15661,37 +15668,48 @@ ratio. This value is calculated from a clump's peak value (@mymath{C_c})
 and the highest valued river pixel around that clump (@mymath{R_c}). Both
 are calculated on the convolved image (signified by the @mymath{c}
 subscript). To avoid absolute differences, it is then divided by the input
-Sky standard deviation under that clump @mymath{\sigma} as shown below.
+(not convolved) Sky standard deviation under that clump (@mymath{\sigma})
+as shown below.
 
 @dispmath{C_c-R_c\over \sigma}
 
 The input Sky standard deviation dataset (@option{--std}) is assumed to be
 for the unconvolved image. Therefore a constant factor (related to the
 convolution kernel) is necessary to convert this into an absolute peak
-signal-to-noise ratio@footnote{You can mask all detections on the convolved
-image with @ref{Arithmetic}, then calculate the standard deviation of the
-(masked) convolved with the @option{--sky} option of @ref{Statistics} and
-compare values on the same tile with NoiseChisel's output.}. But as far as
-Segment is concerned, this absolute value is irrelevant: because it uses
-the ambient noise (undetected regions) to find the numerical threshold of
-this fraction and applies that over the detected regions.
-
-The convolved image has much less scatter, and the peak (maximum when
-@option{--minima} is not called) value of a distribution is strongly
-affected by scatter. Therefore the @mymath{C_c-R_c} is a more reliable
-(with less scatter) measure to identify signal than @mymath{C-R} (on the
-un-convolved image).
+signal-to-noise ratio@footnote{To get an estimate of the standard deviation
+correction factor between the input and convolved images, you can take the
+following steps: 1) Mask (set to NaN) all detections on the convolved image
+with the @code{where} operator or @ref{Arithmetic}. 2) Calculate the
+standard deviation of the undetected (non-masked) pixels of the convolved
+image with the @option{--sky} option of @ref{Statistics} (which also
+calculates the Sky standard deviation). Just make sure the tessellation
+settings of Statistics and NoiseChisel are the same (you can check with the
+@option{-P} option). 3) Divide the two standard deviation datasets to get
+the correction factor.}. As far as Segment is concerned, the absolute value
+of this correction factor is irrelevant: because it uses the ambient noise
+(undetected regions) to find the numerical threshold of this fraction and
+applies that over the detected regions.
+
+A distribution's extremum (maximum or minimum) values, used in the new
+criteria, are strongly affected by scatter. On the other hand, the convolved
+image has much less scatter@footnote{For more on the effect of convolution
+on a distribution, see Section 3.1.1 of
+@url{https://arxiv.org/abs/1505.01664, Akhlaghi and Ichikawa
+[2015]}.}. Therefore @mymath{C_c-R_c} is a more reliable (with less
+scatter) measure to identify signal than @mymath{C-R} (on the un-convolved
+image).
 
 Initially, the total clump signal-to-noise ratio of each clump was used,
 see Section 3.2.1 of @url{https://arxiv.org/abs/1505.01664, Akhlaghi and
 Ichikawa [2015]}. Therefore its completeness decreased dramatically when
 clumps were present on gradients. In tests, this measure proved to be more
-successful in detecting clumps on gradients and on flatter regions.
+successful in detecting clumps on gradients and on flatter regions
+simultaneously.
 
 @item
 With the new @option{--minima} option, it is now possible to detect inverse
-clumps (for example absorption features), where the clump building should
-begin from its smallest value.
+clumps (for example absorption features). In such cases, the clump should
+be built from its smallest value.
 @end itemize
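As a toy numeric check of the clump identification measure discussed in the hunk above, with invented values for the peak, the river and the standard deviation (these numbers are not from any real dataset):

```shell
# Toy illustration of Segment's clump peak signal-to-noise measure
# (C_c - R_c)/sigma: Cc is the clump's peak on the convolved image,
# Rc the highest-valued river pixel around it, and sigma the input
# Sky standard deviation under the clump.  All values are invented.
awk 'BEGIN { Cc = 12.5; Rc = 4.5; sigma = 2.0
             printf "%.1f\n", (Cc - Rc) / sigma }'
```

Segment only compares the distribution of this fraction over undetected regions against its value over detections, so (as the hunk notes) the absolute correction factor between the convolved and unconvolved standard deviations drops out.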
 
 
@@ -28289,48 +28307,53 @@ been updated. The following procedure can be a good suggestion to take when
 you have a new idea and are about to start implementing it.
 
 The steps below are not a requirement, the important thing is that when you
-send the program to be included in Gnuastro, the book and the code have to
-both be fully up-to-date and compatible and the purpose should be very
-clearly explained. You can follow any path you choose to do this, the
-following path was what we have found to be most successful until now.
+send your work to be included in Gnuastro, the book and the code have to
+both be fully up-to-date and compatible, with the purpose of the update
+very clearly explained. You can follow any strategy you like, the following
+strategy was what we have found to be most useful until now.
 
 @enumerate
 @item
-Edit the book and fully explain your desired change, such that your
-idea is completely embedded in the general context of the book with
-no sense of discontinuity for a first time reader. This will allow you
-to plan the idea much more accurately and in the general context of
-Gnuastro or a particular program. Later on, when you are coding, this
-general context will significantly help you as a road-map.
-
-A very important part of this process is the program introduction, which
-explains the purposes of the program. Before actually starting to code,
-explain your idea's purpose thoroughly in the start of the program section
-you wish to add or edit. While actually writing its purpose for a new
-reader, you will probably get some very valuable ideas that you hadn't
+Edit the book and fully explain your desired change, such that your idea is
+completely embedded in the general context of the book with no sense of
+discontinuity for a first time reader. This will allow you to plan the idea
+much more accurately and in the general context of Gnuastro (a particular
+program or library). Later on, when you are coding, this general context
+will significantly help you as a road-map.
+
+A very important part of this process is the program/library introduction.
+These first few paragraphs explain the purposes of the program or library
+and are fundamental to Gnuastro. Before actually starting to code, explain
+your idea's purpose thoroughly in the start of the respective/new section
+you wish to work on. While actually writing its purpose for a new reader,
+you will probably get some valuable and interesting ideas that you hadn't
 thought of before. This has occurred several times during the creation of
-Gnuastro. If an introduction already exists, embed or blend your idea's
-purpose with the existing purposes. We emphasize that doing this is equally
-useful for you (as the programmer) as it is useful for the user
-(reader). Recall that the purpose of a program is very important, see
-@ref{Program design philosophy}.
-
-As you have already noticed for every program, it is very important that
-the basics of the science and technique be explained in separate
+Gnuastro.
+
+If an introduction already exists, embed or blend your idea's purpose with
+the existing introduction. We emphasize that doing this is equally useful
+for you (as the programmer) as it is useful for the user (reader). Recall
+that the purpose of a program is very important, see @ref{Program design
+philosophy}.
+
+As you have already noticed for every program/library, it is very important
+that the basics of the science and technique be explained in separate
 subsections prior to the `Invoking Programname' subsection. If you are
 writing a new program or your addition to an existing program involves a
 new concept, also include such subsections and explain the concepts so a
 person completely unfamiliar with the concepts can get a general initial
 understanding. You don't have to go deep into the details, just enough to
-get an interested person (with absolutely no background) started. If you
-feel you can't do that, then you have probably not understood the concept
-yourself! If you feel you don't have the time, then think about yourself as
-the reader in one year: you will forget almost all the details, so now that
-you have done all the theoretical preparations, add a few more hours and
-document it, so next time you don't have to prepare as much. Have in mind
-that your only limitation in length is the fatigue of the reader after
-reading a long text, nothing else. So as long as you keep it
-relevant/interesting for the reader, there is no page number limit/cost!
+get an interested person (with absolutely no background) started with some
+good pointers/links to where they can continue studying if they are more
+interested. If you feel you can't do that, then you have probably not
+understood the concept yourself. If you feel you don't have the time, then
+think about yourself as the reader in one year: you will forget almost all
+the details, so now that you have done all the theoretical preparations,
+add a few more hours and document it. Therefore in one year, when you find
+a bug or want to add a new feature, you don't have to prepare as much. Have
+in mind that your only limitation in length is the fatigue of the reader
+after reading a long text, nothing else. So as long as you keep it
+relevant/interesting for the reader, there is no page number limit/cost.
 
 It might also help if you start discussing the usage of your idea in the
 `Invoking ProgramName' subsection (explaining the options and arguments you
@@ -28350,6 +28373,9 @@ After your work has been fully implemented, read the 
section documentation
 from the start and see if you didn't miss any change in the coding and to
 see if the context is fairly continuous for a first time reader (who hasn't
 seen the book or had known Gnuastro before you made your change).
+
+@item
+If the change is notable, also update the @file{NEWS} file.
 @end enumerate
 
 


