Human Generated Data

Title

Untitled (children riding on toy train)

Date

c. 1951

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15613

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Furniture 100
Person 97.3
Human 97.3
Crib 97
Person 95.5
Person 93.6
Person 91.7

Imagga
created on 2022-02-05

carton 48.5
currency 45.8
container 45.2
box 45
money 42.5
cash 36.6
paper 34
finance 32.9
banking 30.3
bank 29.5
business 29.1
baby bed 28.1
wealth 27.8
bill 27.6
savings 27
note 24.8
financial 24
furniture 23.9
crib 23
exchange 22.9
investment 22
dollar 21.3
banknote 20.4
hundred 20.3
envelope 20.2
bills 18.5
dollars 18.3
us 17.3
notes 17.3
close 15.4
furnishing 14.8
economy 14.8
book 14.6
pay 14.4
rate 12.6
cradle 12.6
child 11.8
blank 11.1
paying 10.7
payment 10.6
change 10.6
loan 10.5
one 10.4
rich 10.2
closeup 10.1
banknotes 9.8
monetary 9.8
success 9.7
product 9.6
reading 9.5
save 9.5
buy 9.4
stock 9.4
letter 9.2
market 8.9
happy 8.8
symbol 8.7
union 8.7
used 8.6
newspaper 8.5
old 8.4
office 8.1
covering 8.1
binder 8
concepts 8
funds 7.8
books 7.7
debt 7.7
profit 7.7
coin 7.6
pile 7.5
commerce 7.5
study 7.5
number 7.5
school 7.4
page 7.4
creation 7.4
object 7.3
message 7.3
book jacket 7.3
people 7.2
home 7.2
travel 7

Google
created on 2022-02-05

Photograph 94.2
Motor vehicle 90
Smile 88
Font 81.6
Toddler 75.9
Baby 75.2
Rolling stock 74.6
Vintage clothing 74.6
Train 74.4
Snapshot 74.3
Classic 73.9
Suit 72.9
Art 70.2
Child 70
Photo caption 67.3
Illustration 66.9
Room 64.9
Advertising 64.7
Fun 64.5
Stock photography 63.7

Microsoft
created on 2022-02-05

baby 98.7
toddler 96.5
human face 95.1
indoor 94.9
text 93.8
child 92.5
person 90.6
clothing 78.8
posing 42.5
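
The flat "label score" lines above pair each tag with the issuing service's confidence score on a 0-100 scale. A minimal sketch of turning such lines into structured records (the `parse_tags` helper is hypothetical, not part of the original data pipeline):

```python
# Parse "label confidence" lines, as listed above, into (label, score) pairs.
# rpartition splits on the last space, so multi-word labels like
# "Motor vehicle 90" stay intact.

def parse_tags(lines):
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample rows copied from the Amazon tag list above.
amazon = parse_tags([
    "Furniture 100",
    "Person 97.3",
    "Human 97.3",
    "Crib 97",
])
print(amazon[0])  # ('Furniture', 100.0)
```

A structure like this makes it straightforward to filter tags by a confidence threshold or compare labels across the four services.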

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 99.9%
Calm 98.7%
Sad 0.8%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 4-10
Gender Male, 95.7%
Happy 97.4%
Surprised 0.9%
Calm 0.6%
Confused 0.5%
Disgusted 0.2%
Angry 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 6-16
Gender Female, 100%
Calm 60.8%
Surprised 10.7%
Happy 8.3%
Confused 7%
Angry 5.4%
Sad 3%
Fear 2.8%
Disgusted 2.1%

AWS Rekognition

Age 2-10
Gender Female, 100%
Fear 65.4%
Surprised 16.5%
Calm 14.2%
Angry 1.2%
Sad 0.9%
Happy 0.7%
Confused 0.6%
Disgusted 0.6%

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.3%
Crib 97%

Captions

Microsoft

a group of people posing for a photo 75.6%
a group of people posing for the camera 75.5%
a baby posing for the camera 47.1%

Text analysis

Amazon

CITY
ST.
LOUIS
CITY OF ST. LOUIS
OF

Google

1DUS
ST.
CITY OF ST. 1DUS
CITY
OF