Human Generated Data

Title

Untitled (group of people standing in corner of room behind ribbon)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17066

Machine Generated Data

Tags (label confidence, 0-100)

Amazon
created on 2022-02-26

Clothing 100
Apparel 100
Person 99
Human 99
Person 98.8
Person 98.6
Person 98.4
Dress 98
Person 97.4
Person 97.1
Person 96.6
Person 96.3
Person 94.5
Female 93.8
Person 93.2
Person 92.5
Person 92.1
Person 91.5
Robe 91.5
Fashion 91.5
Face 90.9
Gown 90.2
Tie 88.7
Accessories 88.7
Accessory 88.7
Skirt 88.2
Wedding 87.4
People 86.3
Person 83.6
Woman 83.5
Bridegroom 82.1
Suit 79.8
Overcoat 79.8
Coat 79.8
Wedding Gown 78.8
Crowd 75
Portrait 70.7
Photography 70.7
Photo 70.7
Person 70.7
Bride 69.9
Costume 67.9
Leisure Activities 64.1
Person 63.6
Girl 60.2
Shoe 59.3
Footwear 59.3
Stage 58.3
Chair 56
Furniture 56
Dance Pose 55.2
Sailor Suit 55.2
Man 55.1
Person 45.6
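
The label/confidence pairs above are the shape of output produced by Amazon Rekognition's DetectLabels API. A minimal sketch of an equivalent call via boto3; the filename is a placeholder, and AWS credentials are assumed to come from the environment:

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("4.2002.17066.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # the lowest score listed above is Person 45.6
)

# Each label carries a 0-100 confidence, matching the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```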

Clarifai
created on 2023-10-29

people 99.9
group 99.4
group together 99
many 98.3
man 96.6
woman 96
leader 95.2
adult 95.1
child 92.4
several 92.4
administration 89.9
recreation 89.7
education 87.4
family 85.8
street 83.4
actor 83.3
home 82.4
crowd 80.9
boy 79.2
music 75.1
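
Clarifai's v2 predict endpoint returns concepts like those above with values on a 0-1 scale, scaled to 0-100 on this page. A hedged sketch against that REST endpoint, where the API key, model ID, and image URL are placeholders and the exact path should be checked against Clarifai's current documentation:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concept values are 0-1; scale to 0-100 to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```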

Imagga
created on 2022-02-26

people 30.7
brass 28.4
man 28.2
nurse 27.1
wind instrument 25.4
male 24.1
men 21.5
adult 21.3
musical instrument 21
person 20.4
business 19.4
businessman 18.5
group 17.7
women 16.6
portrait 16.2
human 14.2
happy 13.8
world 13.3
life 12.7
city 12.5
walking 12.3
black 12
motion 12
professional 11.7
suit 11.7
weapon 11.7
team 11.6
sport 11.5
urban 11.4
couple 11.3
corporate 11.2
worker 11
job 10.6
performer 10.6
outdoors 10.4
boy 10.4
kin 10.4
smile 10
dress 9.9
travel 9.9
sword 9.5
day 9.4
smiling 9.4
lifestyle 9.4
uniform 9.2
occupation 9.2
active 9
together 8.8
crowd 8.6
work 8.6
singer 8.5
fashion 8.3
holding 8.2
building 8.2
success 8
family 8
musician 7.9
happiness 7.8
standing 7.8
scene 7.8
attractive 7.7
bride 7.7
clothing 7.5
silhouette 7.4
company 7.4
action 7.4
street 7.4
art 7.3
teenager 7.3
exercise 7.3
color 7.2
handsome 7.1
architecture 7
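
The Imagga tags above match the response shape of Imagga's /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP basic auth. A minimal sketch with placeholder credentials and image URL:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),         # placeholders
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```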

Google
created on 2022-02-26
(no tags recorded)

Microsoft
created on 2022-02-26

person 99.5
clothing 94.5
standing 90.4
wedding dress 89.9
people 88.4
outdoor 85.4
bride 82.3
woman 79.7
group 73
dress 69.6
text 67.8
man 57
old 43.1
dressed 38.1
clothes 16.2
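
The Microsoft tags above correspond to the Tags visual feature of Azure Computer Vision's analyze operation, which reports confidence on a 0-1 scale (scaled to 0-100 on this page). A minimal sketch with placeholder endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},  # placeholder
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```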

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 96.7%
Calm 97.6%
Surprised 1%
Angry 0.5%
Confused 0.3%
Sad 0.2%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Male, 93.1%
Calm 99.4%
Sad 0.4%
Happy 0.1%
Angry 0%
Confused 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 49-57
Gender Male, 67.3%
Calm 99.1%
Happy 0.7%
Sad 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 97.9%
Sad 28.1%
Happy 26.3%
Fear 24.3%
Surprised 13.9%
Calm 2.6%
Angry 2%
Disgusted 1.6%
Confused 1.1%

AWS Rekognition

Age 29-39
Gender Male, 98%
Disgusted 36.2%
Calm 31.1%
Angry 14.6%
Sad 8.2%
Fear 4%
Surprised 2.9%
Confused 1.6%
Happy 1.2%

AWS Rekognition

Age 29-39
Gender Male, 72.7%
Sad 45%
Happy 11.8%
Disgusted 10.9%
Calm 9.3%
Surprised 8.1%
Fear 6.3%
Angry 4.8%
Confused 3.8%

AWS Rekognition

Age 47-53
Gender Male, 53.6%
Calm 98.6%
Sad 0.4%
Happy 0.4%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 85%
Confused 10.3%
Sad 3.3%
Happy 0.5%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Female, 98.4%
Calm 59.6%
Sad 17.4%
Happy 15.1%
Confused 3.1%
Fear 1.3%
Disgusted 1.3%
Angry 1.3%
Surprised 0.9%

AWS Rekognition

Age 24-34
Gender Male, 97.2%
Sad 79.2%
Calm 11.7%
Angry 4.2%
Confused 2.6%
Disgusted 1.1%
Surprised 0.4%
Fear 0.4%
Happy 0.4%

AWS Rekognition

Age 26-36
Gender Male, 94.3%
Sad 82.9%
Calm 7.8%
Fear 2.6%
Happy 2%
Angry 1.8%
Confused 1.4%
Disgusted 0.9%
Surprised 0.6%

AWS Rekognition

Age 29-39
Gender Female, 69.3%
Calm 95.2%
Sad 3.3%
Happy 0.7%
Confused 0.3%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 30-40
Gender Female, 76%
Sad 37.4%
Happy 35%
Confused 14.2%
Calm 3.9%
Angry 3.2%
Fear 2.6%
Surprised 2%
Disgusted 1.9%

AWS Rekognition

Age 40-48
Gender Female, 88%
Calm 91.4%
Surprised 7.2%
Disgusted 0.4%
Sad 0.4%
Angry 0.2%
Confused 0.2%
Happy 0.1%
Fear 0.1%
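
Each AWS Rekognition block above (an estimated age range, a gender call with confidence, and an eight-way emotion distribution) matches the per-face output of Rekognition's DetectFaces API when all attributes are requested. A minimal sketch, with a placeholder filename and credentials assumed from the environment:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17066.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age, gender, and emotions
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as a distribution over eight types, as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```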

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
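
The Google Vision blocks above report bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores; this is how the Cloud Vision face-detection API expresses its estimates. A minimal sketch using the google-cloud-vision client, with a placeholder filename and credentials assumed to be configured:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.17066.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```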

Feature analysis

Amazon

Detected features: Person, Tie, Shoe
Person 99%
Person 98.8%
Person 98.6%
Person 98.4%
Person 97.4%
Person 97.1%
Person 96.6%
Person 96.3%
Person 94.5%
Person 93.2%
Person 92.5%
Person 92.1%
Person 91.5%
Person 83.6%
Person 70.7%
Person 63.6%
Person 45.6%
Tie 88.7%
Shoe 59.3%
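
The repeated Person entries above are consistent with instance-level detections: Rekognition's DetectLabels returns an Instances list for labels such as Person, Tie, and Shoe, one entry per detected object with its own bounding box and confidence. A minimal self-contained sketch:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17066.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    # Instances is present only for object labels that were localized.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'left={box["Left"]:.2f} top={box["Top"]:.2f}')
```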

Text analysis

Amazon

KODAK-SELA
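
The string above is the kind of output returned by Rekognition's DetectText API, here most likely a film edge marking read from the print. A minimal sketch with a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17066.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole strings; WORD entries repeat them token by token.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```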