Human Generated Data

Title

Untitled (children riding on toy train)

Date

c. 1951

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15610

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Furniture 99.9
Person 96.6
Human 96.6
Person 96.2
Crib 96.1
Person 89.9
Person 88
Person 86.7
Person 79.9
Person 72.4
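
The Amazon values above are object and scene labels with confidence scores (percent). A minimal sketch of how such labels can be requested from AWS Rekognition via boto3 follows; the file name and thresholds are placeholder assumptions, not part of this record.

```python
# Sketch: request image labels from AWS Rekognition (boto3).
# "photo.jpg" and the thresholds are placeholder assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,          # cap the number of labels returned
        MinConfidence=70.0,    # drop low-confidence labels
    )

for label in response["Labels"]:
    # e.g. "Furniture 99.9", "Person 96.6", matching the list above
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```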

Clarifai
created on 2023-10-29

child 99.1
people 96.4
girl 93.2
woman 88.8
two 88
fun 87.1
retro 86.1
family 85.9
love 85.3
indoors 85.3
portrait 85.1
street 83.3
art 81.6
boy 81
sit 79.6
landscape 79.1
wear 79
outdoors 78.7
vector 78
son 77.3
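
The Clarifai values above are concept predictions with confidence scores. A hedged sketch of one way to obtain such concepts through Clarifai's v2 REST API is below; the API key, image URL, and model name are placeholder assumptions and may differ from how this record was actually generated.

```python
# Sketch: tag an image with a Clarifai general recognition model over REST.
# The API key, image URL, and model id are placeholder assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"            # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # e.g. "child 99.1", "people 96.4", matching the list above
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```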

Imagga
created on 2022-02-05

blackboard 75.7
container 51.4
envelope 46.7
carton 33.6
box 26.1
paper 25.2
grunge 23.9
billboard 22.8
vintage 22.3
structure 22.3
old 22.3
texture 20.2
retro 19.7
signboard 17.7
antique 17.3
classroom 16.7
aged 16.3
blank 16.3
frame 15.8
school 15.8
education 14.7
ancient 14.7
student 14.5
business 14
space 13.2
study 13.1
empty 12.9
money 12.8
currency 12.6
learn 12.3
wallpaper 12.3
cash 11.9
finance 11.8
chalkboard 11.8
people 11.7
material 11.6
college 11.4
board 11.3
wall 11.2
dirty 10.9
grime 10.7
person 10.7
decay 10.6
class 10.6
graphic 10.2
card 10.2
note 10.1
man 10.1
house 10
fracture 9.7
decoration 9.7
exam 9.6
university 9.6
happy 9.4
page 9.3
grain 9.2
art 9.1
border 9
bank 9
mottled 8.8
crumpled 8.7
drawing 8.7
parchment 8.6
notes 8.6
damaged 8.6
leaf 8.6
bill 8.6
male 8.5
flower 8.5
design 8.4
portrait 8.4
silhouette 8.3
message 8.2
teacher 8.2
book 8.2
backgrounds 8.1
text 7.9
color 7.8
floral 7.7
worn 7.6
horizontal 7.5
smart 7.5
pattern 7.5
greeting 7.4
new 7.3
group 7.3
adult 7.2
success 7.2
home 7.2
copy 7.1
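
The Imagga values above are tags with confidence scores. A hedged sketch of the Imagga v2 tagging endpoint is below; the credentials and image URL are placeholder assumptions.

```python
# Sketch: request tags from the Imagga v2 tagging endpoint.
# API credentials and the image URL are placeholder assumptions.
import requests

IMAGGA_KEY = "YOUR_KEY"                      # placeholder
IMAGGA_SECRET = "YOUR_SECRET"                # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # e.g. "blackboard 75.7", "container 51.4", matching the list above
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```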

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 94.6
human face 79.2
person 66.3
baby 52.6
old 49.7
posing 36.8

Color analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Male, 95.9%
Calm 82.2%
Surprised 10.6%
Confused 2.3%
Fear 1.8%
Angry 1.2%
Disgusted 1.1%
Happy 0.4%
Sad 0.4%

AWS Rekognition

Age 2-8
Gender Female, 98.2%
Happy 81.5%
Calm 16.7%
Surprised 0.4%
Confused 0.4%
Angry 0.3%
Sad 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 10-18
Gender Female, 99.8%
Calm 99.7%
Angry 0.1%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Sad 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 1-7
Gender Female, 100%
Calm 90.5%
Surprised 2.4%
Happy 2.1%
Confused 2%
Sad 0.9%
Angry 0.8%
Disgusted 0.8%
Fear 0.6%

AWS Rekognition

Age 23-33
Gender Female, 99.4%
Happy 99.5%
Calm 0.2%
Surprised 0.1%
Sad 0%
Fear 0%
Angry 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 6-16
Gender Female, 99.9%
Happy 44.3%
Angry 19.8%
Calm 18.9%
Confused 10%
Disgusted 2.1%
Sad 1.8%
Surprised 1.7%
Fear 1.5%

AWS Rekognition

Age 54-64
Gender Female, 94.8%
Calm 96%
Happy 1.4%
Sad 0.7%
Surprised 0.6%
Fear 0.5%
Confused 0.4%
Angry 0.2%
Disgusted 0.1%
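
Each AWS Rekognition block above reports an estimated age range, a gender guess with its confidence, and per-emotion confidences for one detected face. A minimal boto3 sketch of the call that yields these fields follows; the file name is a placeholder assumption.

```python
# Sketch: per-face age range, gender, and emotion estimates from AWS Rekognition.
# "photo.jpg" is a placeholder assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```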

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
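
The Google Vision blocks above report per-face likelihood ratings (Very unlikely through Very likely) rather than percentages. A sketch of the Cloud Vision face-detection call that produces these likelihoods is below; the file name is a placeholder assumption.

```python
# Sketch: per-face likelihoods (joy, sorrow, anger, surprise, headwear, blur)
# from the Google Cloud Vision API. "photo.jpg" is a placeholder assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY, UNLIKELY, LIKELY
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```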

Feature analysis

Amazon

Person
Person 96.6%
Person 96.2%
Person 89.9%
Person 88%
Person 86.7%
Person 79.9%
Person 72.4%

Crib
Crib 96.1%

Captions

Microsoft
created on 2022-02-05

a baby posing for the camera 47.3%
a person holding a baby 34.7%
a baby posing for a photo 34.6%
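
The captions above are candidate image descriptions with confidences. A hedged sketch using the Azure Computer Vision "describe" operation is below; the endpoint, key, and file name are placeholder assumptions.

```python
# Sketch: candidate captions with confidences from the Azure Computer Vision
# "describe" operation. Endpoint, key, and file name are placeholder assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                 # placeholder
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # e.g. "a baby posing for the camera 47.3%", matching the list above
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```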

Text analysis

Amazon

CITY
OF
LOUIS
CITY OF ST. LOUIS
ST.
LOUI
CITY OF ST. LOUI
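
The Amazon text results above mix word-level fragments with full detected lines, which is why "CITY", "OF", and "ST." appear alongside "CITY OF ST. LOUIS". A minimal boto3 sketch of the underlying call follows; the file name is a placeholder assumption.

```python
# Sketch: word- and line-level text detections from AWS Rekognition.
# "photo.jpg" is a placeholder assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Both LINE and WORD detections are returned, which is why fragments
    # such as "CITY" and "OF" appear alongside "CITY OF ST. LOUIS".
    print(detection["Type"], detection["DetectedText"])
```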

Google

CITY OF ST. LOUIS CITY OF ST. LOU
CITY
OF
ST.
LOUIS
LOU