Human Generated Data

Title

Untitled (children playing dress-up and walking baby carriage on sidewalk)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15592

Machine Generated Data

Tags

Amazon
created on 2022-03-19

Clothing 99.8
Apparel 99.8
Person 99.3
Human 99.3
Person 98.4
Person 98
Person 95
Overcoat 79.7
Coat 79.7
Car 73.6
Transportation 73.6
Vehicle 73.6
Automobile 73.6
Hat 71.2
Sun Hat 66.7
Spoke 58.1
Machine 58.1
Tire 57.9
Suit 56.6
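
The flat "label confidence" lines above can be parsed into structured pairs and filtered by a confidence cutoff. A minimal sketch, assuming the raw text format shown (label name, space, percentage) and an illustrative 70% threshold:

```python
# Sketch: parse "label confidence" lines into (label, score) pairs.
# The sample text below is a subset of the Amazon tags above; the
# 70.0 cutoff is an assumption for illustration, not part of the record.
raw = """Clothing 99.8
Apparel 99.8
Person 99.3
Car 73.6
Suit 56.6"""

def parse_labels(text):
    pairs = []
    for line in text.splitlines():
        # rsplit keeps multi-word labels (e.g. "Sun Hat") intact
        name, score = line.rsplit(" ", 1)
        pairs.append((name, float(score)))
    return pairs

labels = parse_labels(raw)
confident = [(n, s) for n, s in labels if s >= 70.0]
print(confident)  # labels at or above the 70% cutoff
```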

Clarifai
created on 2023-10-29

people 99.9
man 97.8
woman 97.1
group 96.8
child 96.1
adult 96
street 95.4
group together 93.8
monochrome 90.2
family 88.1
walk 86.8
two 84.6
offspring 84.3
three 84.2
administration 83.1
dancing 80.7
leader 80.3
portrait 80.2
veil 79.7
four 79.3

Imagga
created on 2022-03-19

snow 45.8
man 33.6
silhouette 24.8
weather 23.7
male 23.6
people 23.4
winter 19.6
world 18.9
kin 18.1
person 17.6
sport 16.5
cold 16.4
sky 14.7
business 14
boy 13.9
outdoors 13.6
walk 13.3
businessman 13.2
walking 12.3
couple 12.2
landscape 11.9
adult 11.7
active 11.7
sunset 11.7
outdoor 11.5
together 11.4
day 11
child 10.9
mountain 10.8
park 10.7
forest 10.4
office 10.4
men 10.3
season 10.1
tree 10
fun 9.7
life 9.5
women 9.5
friends 9.4
black 9
vacation 9
activity 9
freeze 8.7
full length 8.7
lifestyle 8.7
adventure 8.5
two 8.5
travel 8.5
friendship 8.4
alone 8.2
building 8.2
sun 8.2
group 8.1
team 8.1
water 8
trees 8
love 7.9
sitting 7.7
relaxation 7.5
holding 7.4
ice 7.4
exercise 7.3
recreation 7.2
shadow 7.2
job 7.1
slope 7

Google
created on 2022-03-19

Microsoft
created on 2022-03-19

sky 99.1
outdoor 92
clothing 91.5
man 89.8
person 89.5
black and white 85.4
text 84.9
funeral 77.3
gallery 68.2
picture frame 7.6

Color Analysis

Face analysis

AWS Rekognition

Age 2-10
Gender Male, 97%
Calm 99.7%
Confused 0.1%
Happy 0.1%
Surprised 0%
Angry 0%
Sad 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 4-10
Gender Female, 100%
Calm 92%
Confused 5.2%
Sad 2.4%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 6-16
Gender Male, 98.7%
Angry 89.4%
Confused 4.5%
Sad 3.9%
Calm 1.5%
Happy 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
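
Each Rekognition face record above is a distribution of emotion percentages that can be reduced to a single dominant emotion. A minimal sketch using the values from the third face record; the helper name is an assumption:

```python
# Emotion percentages copied from the third AWS Rekognition face above.
face3 = {
    "Angry": 89.4, "Confused": 4.5, "Sad": 3.9, "Calm": 1.5,
    "Happy": 0.3, "Surprised": 0.2, "Disgusted": 0.2, "Fear": 0.1,
}

def dominant_emotion(scores):
    # Return the highest-confidence emotion and its percentage.
    name = max(scores, key=scores.get)
    return name, scores[name]

print(dominant_emotion(face3))  # → ('Angry', 89.4)
```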

Microsoft Cognitive Services

Age 4
Gender Male

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
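
Unlike the percentage scores above, Google Vision reports face attributes as ordered likelihood buckets (from "Very unlikely" to "Very likely"). A sketch of mapping those labels to comparable ranks, assuming the five-bucket ordering shown in these records:

```python
# Ordered likelihood buckets as they appear in the Google Vision
# records above; the rank helper is an assumption for illustration.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def likelihood_rank(label):
    # Lower rank = less likely; allows comparing attributes numerically.
    return LIKELIHOOD.index(label)

# The second face's "Headwear Possible" outranks the others' "Very unlikely".
print(likelihood_rank("Possible") > likelihood_rank("Very unlikely"))  # → True
```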

Feature analysis

Amazon

Person
Car
Person 99.3%
Person 98.4%
Person 98%
Person 95%
Car 73.6%

Categories