Human Generated Data

Title

Untitled (full length portrait, man seated on left, woman standing on right)

Date

1866-1899

People

Artist: Fenton and Andruss, American, active mid-19th to early-20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.4042

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Person 99.3
Human 99.3
Person 99.3
Clothing 91.1
Apparel 91.1
People 86.9
Performer 77.1
Coat 64.6
Photography 64.3
Photo 64.3
Face 63.2
Portrait 63.2
Art 61.8
Overcoat 57.4
Advertisement 55.5
Poster 55.5

Clarifai
created on 2019-11-10

people 99.6
portrait 97.7
man 97
adult 96.1
woman 95.7
art 94.3
vintage 94
retro 93
wear 90.2
music 85.3
old 85.3
girl 84.7
antique 83.1
one 81
child 80.7
illustration 79.3
desktop 78.1
two 73.7
couple 72.7
actor 69.6

Imagga
created on 2019-11-10

kin 69.2
statue 32.6
sculpture 25.1
religion 23.3
art 21.9
ancient 18.2
old 17.4
architecture 17.2
stone 17.1
book jacket 16.8
people 16.2
monument 14.9
portrait 14.9
man 14.2
love 14.2
jacket 14.1
history 13.4
ruler 13.4
person 13.3
religious 13.1
culture 12.8
dress 12.6
tourism 12.4
male 12.1
vintage 11.6
god 11.5
lady 11.4
travel 11.3
antique 11.3
building 11.1
decoration 11
historic 11
adult 11
model 10.9
marble 10.8
romantic 10.7
face 10.7
attractive 10.5
sexy 10.4
historical 10.4
famous 10.2
sand 10.2
figure 10
wrapping 9.9
catholic 9.7
sepia 9.7
detail 9.7
holy 9.6
couple 9.6
groom 8.7
spirituality 8.6
earth 8.6
soil 8.4
fashion 8.3
bride 8.1
temple 7.9
happiness 7.8
pray 7.8
golden 7.7
covering 7.7
grunge 7.7
outdoor 7.6
decorative 7.5
traditional 7.5
city 7.5
church 7.4
artwork 7.3
world 7.3
body 7.2
museum 7

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

text 99.6
clothing 98.9
person 98.1
man 93.6
vintage clothing 80
old 75.6
human face 73
dress 70.6
retro style 70.5
photograph 70
smile 68.1
posing 66.1
woman 56.3
picture frame 8.9

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 99.3%
Confused 0.5%
Disgusted 0%
Surprised 0%
Happy 0.2%
Sad 0.2%
Angry 0%
Fear 0%
Calm 99.1%

AWS Rekognition

Age 11-21
Gender Female, 85.7%
Calm 25.7%
Fear 0.2%
Sad 70.9%
Surprised 0.3%
Angry 1.4%
Disgusted 0.1%
Happy 0.3%
Confused 1%

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Poster 55.5%

Text analysis

Amazon

N.
N. Y.
edre
Y.
y
C y edre
C

Google

JAMESTOWN, N. Y.
JAMESTOWN,
N.
Y.