Human Generated Data

Title

Untitled (nuns, Manchester, New Hampshire)

Date

1931, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.77

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 98.9
Apparel 98.9
Person 98.2
Human 98.2
Person 96.5
Person 96.2
Person 96
Person 96
Person 94.8
Person 93.5
Person 88
Person 85.9
Monitor 79.9
Electronics 79.9
Screen 79.9
Display 79.9
Person 74.9
People 73
Fashion 71.5
Face 68.2
Cloak 64.6
Overcoat 63.2
Coat 63.2
Sleeve 59.8
Evening Dress 59.4
Gown 59.4
Robe 59.4
Funeral 59

Clarifai
created on 2023-10-15

people 99
group 97.8
indoors 94.5
man 93.8
woman 93.7
portrait 92.8
education 91.4
adult 90.3
child 89.2
monochrome 88
school 86.8
nun 85.7
art 80.2
uniform 77.7
family 76.6
group together 76.4
wear 73.8
vector 73.7
wedding 73.4
outfit 72.6

Imagga
created on 2021-12-14

background 72.4
screen 71.7
display 54.7
television 32.6
electronic device 29.1
touch screen 28
web site 24.9
technology 24.5
device 24.1
electronic 21.5
black 20.4
digital 19.4
monitor 19.1
computer 19
equipment 18.9
design 18.6
telecommunication system 17.5
modern 16.1
web 15.2
business 15.2
button 15
communication 14.3
global 13.7
navigation 12.5
media 12.4
object 11.7
information 11.5
laptop 11.4
template 11
video 10.6
sign 10.5
office 10.4
website 10.4
frame 10
art 9.8
idea 9.8
layout 9.7
stereo 9.6
film 9.5
wireless 9.5
system 9.5
set 9.5
icon 9.5
graphic 9.5
symbol 9.4
number 9.3
space 9.3
banner 9.2
board 9
blackboard 9
close 8.6
site 8.4
electronic equipment 8.4
hand 8.4
page 8.3
data 8.2
music 8.1
school 8.1
navigator 7.9
classroom 7.8
blank 7.7
menu 7.6
notebook 7.6
mobile 7.5
study 7.5
presentation 7.4
home 7.2
financial 7.1
science 7.1
broadcasting 7

Google
created on 2021-12-14

Font 75.5
Event 70.8
Art 67
Door 66.5
Monochrome photography 65
Stock photography 63.7
Monochrome 63.5
Rectangle 63.3
Room 63.1
History 60.6
Crew 59.9
Official 58
Team 57.4
Uniform 57.3
Picture frame 56.5
Visual arts 53.2

Microsoft
created on 2021-12-14

text 99
clothing 94.4
person 85.8
black 77.3
dress 57.2
posing 48.8

Face analysis

AWS Rekognition

Age 38-56
Gender Male, 92.2%
Calm 97.3%
Happy 0.6%
Confused 0.6%
Sad 0.5%
Angry 0.5%
Surprised 0.3%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 31-47
Gender Male, 76.8%
Sad 35.2%
Calm 21.4%
Fear 17.3%
Angry 12.5%
Confused 6.1%
Happy 3.2%
Surprised 2.5%
Disgusted 1.7%

AWS Rekognition

Age 24-38
Gender Male, 70.8%
Calm 78.6%
Sad 11.6%
Confused 3.4%
Happy 2.4%
Angry 1.4%
Fear 1.3%
Surprised 0.7%
Disgusted 0.5%

AWS Rekognition

Age 37-55
Gender Male, 54.3%
Calm 64.2%
Sad 14.6%
Confused 12.2%
Angry 3.5%
Happy 2%
Surprised 1.5%
Fear 1.4%
Disgusted 0.7%

AWS Rekognition

Age 23-37
Gender Female, 64.6%
Calm 71.3%
Confused 9.8%
Fear 6.3%
Sad 4.2%
Happy 4.1%
Angry 2%
Surprised 1.6%
Disgusted 0.7%

AWS Rekognition

Age 23-35
Gender Male, 98.2%
Confused 64.7%
Sad 28%
Calm 3.1%
Fear 1.1%
Angry 1.1%
Disgusted 1%
Surprised 0.6%
Happy 0.4%

AWS Rekognition

Age 33-49
Gender Male, 58.5%
Sad 79.3%
Confused 9.4%
Angry 5.7%
Calm 3.7%
Disgusted 1%
Fear 0.5%
Happy 0.2%
Surprised 0.1%

AWS Rekognition

Age 35-51
Gender Male, 79.2%
Sad 86.4%
Calm 5.3%
Fear 4%
Confused 2.9%
Angry 0.7%
Surprised 0.4%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Male, 78.2%
Calm 37.9%
Confused 23.9%
Sad 9.7%
Fear 9.2%
Angry 8.6%
Surprised 6.1%
Happy 3.4%
Disgusted 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%
Monitor 79.9%

Captions

Microsoft
created on 2021-12-14

graphical user interface, website 100%

Text analysis

Amazon

82%
101
New
Studio
Durette
George Durette Studio
George
1931
Hampshire
160
Manchester, New Hampshire
08
Manchester,
19305

Google

101 82% 08 160 George Durette Studio Manchester, New Hampshire E 1930s 193/
101
82%
08
160
George
Durette
Studio
Manchester,
New
Hampshire
E
1930s
193/