Human Generated Data

Title

Untitled (children playing dress-up and walking baby carriage on sidewalk)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15593.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.8
Human 98.8
Person 98.8
Person 98.2
Shelter 88.6
Nature 88.6
Outdoors 88.6
Rural 88.6
Building 88.6
Countryside 88.6
Face 80.6
Furniture 76.1
Urban 64.5
Portrait 63
Photography 63
Photo 63
Clothing 61.3
Apparel 61.3
Housing 61
People 59.8
Neighborhood 55.9
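
The name/confidence pairs above match the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such a list could be reproduced with boto3, assuming configured AWS credentials, a hypothetical local copy of the image ("photo.jpg"), and an assumed confidence cutoff:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of this record.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the list above bottoms out near 55.9
    )

# Print name/confidence pairs in the same form as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```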

Clarifai
created on 2023-10-29

people 99.9
two 99.4
adult 98.8
vehicle 98.8
woman 96.8
home 96.8
child 96.2
man 95.5
family 95.4
chair 95.1
three 95.1
group 94.6
stroller 93.6
carriage 92.6
leader 91.7
transportation system 90.9
recreation 89.6
nostalgia 88.8
offspring 87.6
four 87.3

Imagga
created on 2022-02-05

man 27.5
sword 23.4
weapon 21.1
male 20
people 18.4
sport 18.2
building 16.4
athlete 15.1
newspaper 15
player 14.5
person 13.8
ballplayer 13.5
travel 12
street 12
outdoors 11.9
architecture 11.7
product 11.6
adult 11.4
men 11.2
city 10.8
wall 10.4
portrait 10.3
sports equipment 10.1
wheeled vehicle 9.9
school 9.8
equipment 9.6
tricycle 9.5
walking 9.5
contestant 9.4
old 9.1
park 9.1
black 9
creation 9
sky 8.9
statue 8.7
business 8.5
art 8.5
outdoor 8.4
competition 8.2
exercise 8.2
dress 8.1
religion 8.1
world 7.9
urban 7.9
couple 7.8
ancient 7.8
walk 7.6
stick 7.5
structure 7.5
leisure 7.5
cricket equipment 7.4
activity 7.2

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

outdoor 93.7
text 92.9
house 86.3
black and white 78.8
person 74.6
child 74.2
clothing 73.8

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 83.3%
Sad 67.8%
Calm 26.9%
Happy 1.7%
Confused 0.9%
Surprised 0.9%
Disgusted 0.8%
Angry 0.6%
Fear 0.5%
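
The age range, gender estimate, and emotion confidences above have the shape of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch with boto3, again assuming configured credentials and a hypothetical local image file:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of this record.
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# Report age range, gender, and emotions for each detected face,
# mirroring the fields shown above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```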

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
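
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) correspond to the likelihood enums returned by Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision Python client, assuming configured credentials and a hypothetical local image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph; not part of this record.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Print the same likelihood attributes listed above for each detected face.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```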

Feature analysis

Amazon

Person
Person 98.8%
Person 98.8%
Person 98.2%

Categories