Human Generated Data

Title

Untitled (three children walking a baby carriage outside)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14758

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 98.3
Apparel 98.3
Person 97.6
Human 97.6
Person 95.5
Person 88.1
Shorts 83.8
Furniture 81.5
Face 76.7
Floor 76.7
Chair 72.3
Grass 71.7
Plant 71.7
People 68.4
Wheel 66.8
Machine 66.8
Suit 65.3
Coat 65.3
Overcoat 65.3
Kid 62.3
Child 62.3
Photography 61.2
Photo 61.2
Baby 61.1
Outdoors 59.3
Path 58.8
Female 57.7
Dress 57.2
Indoors 55.7

Clarifai
created on 2023-10-27

people 99.9
adult 98.5
two 98.3
family 98
chair 98
canine 97.9
home 97.5
man 97.2
child 97
furniture 96.7
wedding 96.2
woman 96
room 95.9
dog 95.6
group together 95.2
group 94.5
monochrome 94.3
wear 94.1
bride 93.9
three 93.1

Imagga
created on 2022-01-29

wheeled vehicle 32.9
vehicle 21.1
shopping cart 20.9
mobile home 19.6
newspaper 19.4
man 18.1
handcart 17.5
male 17
trailer 16.6
housing 16.4
product 15.6
transport 15.5
structure 15.2
transportation 15.2
people 14.5
outdoors 14.4
conveyance 14.4
adult 13
city 12.5
creation 12
chair 11.9
container 11.5
urban 11.4
travel 11.3
person 11.2
old 11.1
sport 10.8
seat 10.7
sand 10.5
wheelchair 10.1
road 9.9
day 9.4
building 9.3
street 9.2
vacation 9
sky 8.9
lifestyle 8.7
architecture 8.6
industry 8.5
outdoor 8.4
summer 8.4
tricycle 8.3
speed 8.2
industrial 8.2
activity 8.1
holiday 7.9
couple 7.8
beach 7.7
men 7.7
walk 7.6
bridge 7.6
power 7.6
wheel 7.5
house 7.5
human 7.5
silhouette 7.4
window 7.3
sunset 7.2
recreation 7.2
work 7.1
sea 7
bicycle 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

black and white 95.1
text 88.6
clothing 86.4
person 80.8
man 79.9
sport 68.4
woman 55.1
house 52.6

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Male, 84.3%
Calm 87.9%
Surprised 10%
Sad 0.9%
Angry 0.4%
Disgusted 0.3%
Happy 0.2%
Confused 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Wheel
Person 97.6%
Person 95.5%
Person 88.1%
Wheel 66.8%