Human Generated Data

Title

Untitled (Three women standing with goat)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15093

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.5
Apparel 99.5
Person 99
Human 99
Person 98.9
Female 91.2
Car 88.5
Transportation 88.5
Vehicle 88.5
Automobile 88.5
Person 87.6
Advertisement 83.4
Poster 81.6
Face 80
Woman 78.6
Wheel 77.2
Machine 77.2
People 76.2
Collage 74.5
Suit 74.3
Overcoat 74.3
Coat 74.3
Shorts 73.8
Dress 68.6
Portrait 68
Photography 68
Photo 68
Skirt 65.5
Pants 65.3
Girl 59.1
Standing 55.4
Drawing 55
Art 55

Clarifai
created on 2023-10-29

people 99.9
group 97.3
man 96.8
adult 96.8
three 94.2
elderly 92.5
woman 92.5
wear 91.8
two 91.6
administration 91.3
leader 90.5
child 89.8
medical practitioner 88.7
four 88.5
group together 85.9
offspring 83.6
cavalry 82.7
family 78.5
home 76.7
print 76.2

Imagga
created on 2022-03-05

kin 30
man 24.9
people 24
person 18.1
male 17.9
couple 15.7
walking 15.2
adult 14.8
outdoor 13.8
outdoors 13.4
family 13.3
men 12.9
beach 12.6
parent 12.6
sunset 12.6
old 12.5
silhouette 12.4
businessman 12.4
mother 12.2
business 12.1
love 11
two 11
child 11
summer 10.9
sport 10.9
black 10.2
happy 9.4
water 9.3
suit 9.1
ocean 9.1
sand 9
active 9
religion 9
world 8.8
women 8.7
lifestyle 8.7
happiness 8.6
holiday 8.6
travel 8.4
sky 8.3
vacation 8.2
groom 8.1
sunlight 8
life 7.9
together 7.9
cleaner 7.8
portrait 7.8
outside 7.7
walk 7.6
field 7.5
religious 7.5
lady 7.3
girls 7.3
protection 7.3
danger 7.3
group 7.3
activity 7.2
romance 7.1
romantic 7.1
day 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 96.3
black and white 92.8
person 91
text 90.1
clothing 90.1
white 77.5
car 70.5
black 65.8
footwear 64.9
vehicle 61.7
woman 61.4
old 42.7

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 84.3%
Calm 47.1%
Sad 19.3%
Happy 12.5%
Confused 10.4%
Surprised 5.4%
Fear 2%
Disgusted 1.9%
Angry 1.4%

AWS Rekognition

Age 43-51
Gender Male, 98%
Calm 81.6%
Sad 12.4%
Confused 2%
Disgusted 1.3%
Angry 0.9%
Surprised 0.9%
Happy 0.6%
Fear 0.4%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 76.4%
Surprised 11.8%
Sad 3.6%
Confused 3%
Disgusted 2%
Fear 1.3%
Angry 1.2%
Happy 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Car
Wheel
Person 99%
Person 98.9%
Person 87.6%
Car 88.5%
Wheel 77.2%

Text analysis

Amazon

VIII
a VIII
a

Google

A VI
A
VI