Human Generated Data

Title

Untitled (children lined up on grass outside house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17450

Machine Generated Data

Tags (scores are percent confidence)

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Person 99.6
Clothing 99.6
Apparel 99.6
Person 99.6
Person 99.4
Person 99.3
Person 99.2
Person 98.8
Person 98.4
Person 97.5
Female 96
Skirt 91.9
Person 90.8
People 85.3
Shorts 84.5
Woman 83.7
Person 79.7
Girl 70
Dress 62.5
Standing 59
Portrait 58.5
Photography 58.5
Face 58.5
Photo 58.5
Kid 58.3
Child 58.3
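
The Amazon list above is typical output of AWS Rekognition's DetectLabels operation, which returns label names with percent confidences. Below is a minimal sketch of how such a list could be regenerated with boto3; the file name, region, and confidence cutoff are assumptions for illustration, not values recorded on this page:

    import boto3

    # Hypothetical path to a local scan of the photograph; any JPEG/PNG works.
    IMAGE_PATH = "untitled_children_on_grass.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed cutoff; the list above ends near 58
        )

    for label in response["Labels"]:
        # Each label carries a name and a confidence in percent.
        print(f"{label['Name']} {label['Confidence']:.1f}")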

Clarifai
created on 2023-10-29

people 99.9
child 99.9
group 99
group together 99
boy 96.3
adult 94.4
several 94.1
wear 93.8
recreation 93.8
many 93.4
family 91.2
sibling 90.6
five 90.5
uniform 90.2
man 90
woman 89.9
education 89.5
military 89
war 87.6
four 87.3
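
The Clarifai tags follow the shape of Clarifai's v2 predict API, whose concepts carry values in [0, 1] that render here as percentages. A hedged sketch using plain REST; the endpoint form, model id, API key, and image URL are all assumptions:

    import requests

    # Assumed: a Clarifai API key and the public general recognition model.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"
    IMAGE_URL = "https://example.org/untitled_children_on_grass.jpg"  # placeholder

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai reports values in [0, 1]; scale to percent to match the list.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")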

Imagga
created on 2022-02-26

kin 36.6
world 28.2
people 26.8
man 22.9
sport 20.8
walking 17
child 16.1
person 16
adult 15.8
men 15.5
male 14.9
couple 14.8
travel 14.8
beach 14.3
old 13.9
group 13.7
outdoors 13.6
family 12.4
silhouette 12.4
athlete 11.8
dress 11.7
sand 10.8
active 10.8
women 10.3
happy 10
outdoor 9.9
vacation 9.8
run 9.6
summer 9.6
black 9.6
love 9.5
boy 8.7
water 8.7
ancient 8.6
art 8.6
portrait 8.4
sunset 8.1
lifestyle 7.9
together 7.9
crowd 7.7
winter 7.7
tourist 7.6
walk 7.6
human 7.5
fun 7.5
life 7.5
action 7.4
competition 7.3
mother 7.3
businessman 7.1
happiness 7
sea 7
architecture 7
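
Imagga's scores above match its documented /v2/tags endpoint, which returns percent confidences for language-keyed tags. A sketch; the credentials and image URL are placeholders:

    import requests

    # Assumed Imagga v2 credentials; endpoint and response shape follow
    # Imagga's documented /v2/tags API.
    IMAGGA_KEY = "YOUR_API_KEY"
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/untitled_children_on_grass.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    for entry in resp.json()["result"]["tags"]:
        # Confidence is already a percentage; tag text is keyed by language.
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")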

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 94.4
clothing 92.6
text 92.2
footwear 89.3
child 86.3
black and white 59.2
boy 53.1
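
The Microsoft tags correspond to the Azure Computer Vision Tag operation, which returns confidences in [0, 1] (shown above as percentages). A sketch against the v3.2 REST endpoint; the resource name, key, and image URL are assumptions:

    import requests

    # Assumed Azure Computer Vision resource; v3.2 of the Tag endpoint.
    AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    AZURE_KEY = "YOUR_AZURE_KEY"
    IMAGE_URL = "https://example.org/untitled_children_on_grass.jpg"  # placeholder

    resp = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # Azure reports confidence in [0, 1]; scale to percent to match the list.
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")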

Color analysis

Face analysis

AWS Rekognition (face 1)

Age 18-26
Gender Female, 59%
Sad 45.1%
Calm 44.6%
Disgusted 3%
Confused 2.3%
Surprised 2.1%
Happy 1.7%
Angry 0.8%
Fear 0.4%

AWS Rekognition (face 2)

Age 30-40
Gender Male, 99.9%
Calm 90.3%
Surprised 2.8%
Happy 2.1%
Disgusted 1.2%
Confused 1.1%
Fear 1.1%
Sad 0.9%
Angry 0.5%

AWS Rekognition (face 3)

Age 18-26
Gender Male, 80.3%
Calm 90.4%
Sad 6.5%
Confused 1%
Happy 0.6%
Surprised 0.5%
Angry 0.4%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition (face 4)

Age 38-46
Gender Female, 76.8%
Calm 95.1%
Sad 2.3%
Happy 1.7%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%
Confused 0.1%

AWS Rekognition (face 5)

Age 24-34
Gender Female, 68%
Surprised 48.9%
Calm 13.6%
Happy 13.2%
Sad 9.4%
Confused 8%
Angry 2.9%
Fear 2%
Disgusted 1.9%

AWS Rekognition (face 6)

Age 43-51
Gender Female, 79.3%
Calm 59.8%
Happy 25.5%
Sad 4.4%
Fear 4%
Angry 2.5%
Disgusted 1.5%
Surprised 1.5%
Confused 0.9%

AWS Rekognition (face 7)

Age 22-30
Gender Female, 88.1%
Calm 83%
Fear 5.7%
Sad 5.4%
Surprised 2.6%
Confused 1.3%
Happy 1.2%
Disgusted 0.6%
Angry 0.2%

AWS Rekognition (face 8)

Age 51-59
Gender Female, 54.3%
Calm 99.3%
Sad 0.3%
Happy 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%
Angry 0%

AWS Rekognition (face 9)

Age 36-44
Gender Female, 53.8%
Sad 70.6%
Calm 28.2%
Angry 0.3%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%
Happy 0.2%
Surprised 0.2%
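
Each of the nine blocks above is one face returned by Rekognition's DetectFaces operation with full attributes requested. A minimal boto3 sketch that would print the same age-range, gender, and emotion fields; the file name is an assumption:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_children_on_grass.jpg", "rb") as f:  # assumed local scan
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come unsorted; sort descending to match the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")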

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision (face 5)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 6)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision (face 7)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision (face 8)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 9)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision (face 10)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision (face 11)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision (face 12)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision (face 13)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
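
The thirteen blocks above mirror Google Cloud Vision face detection, which reports each signal as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage. A sketch assuming the google-cloud-vision 2.x Python client and a local scan of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_children_on_grass.jpg", "rb") as f:  # assumed local scan
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for i, face in enumerate(response.face_annotations, start=1):
        # Enum names like VERY_UNLIKELY render as "Very unlikely" on this page.
        print(f"Face {i}")
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)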

Feature analysis

Amazon

Person
Person 99.7%
Person 99.6%
Person 99.6%
Person 99.4%
Person 99.3%
Person 99.2%
Person 98.8%
Person 98.4%
Person 97.5%
Person 90.8%
Person 79.7%
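
The repeated Person rows come from per-instance detections inside a single DetectLabels response: each detected person carries its own bounding box and confidence, so one label yields many rows. A boto3 sketch, with the file name assumed:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_children_on_grass.jpg", "rb") as f:  # assumed local scan
        labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

    for label in labels:
        if label["Name"] == "Person":
            # One entry per detected person, hence the repeated rows above.
            for instance in label["Instances"]:
                box = instance["BoundingBox"]  # relative Left/Top/Width/Height
                print(f"Person {instance['Confidence']:.1f}% at {box}")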

Categories