Human Generated Data

Title

Untitled (couples dancing, seen from above)

Date

1951

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17944

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2022-03-04

Clothing 100
Apparel 100
Human 99.3
Person 99
Gown 98.9
Fashion 98.9
Robe 98.9
Dress 98.8
Person 98.5
Wedding 98.3
Female 98.3
Person 98.2
Bride 97.8
Wedding Gown 97.8
Person 97.2
Person 97.1
Person 96.9
Person 96.5
Person 96.2
Person 96
Person 94.2
Person 93.4
Person 92.3
Woman 92
Person 91.4
Person 90.1
Suit 89.4
Overcoat 89.4
Coat 89.4
Bridegroom 83.5
Person 83.3
Person 76.9
Person 76.6
Person 72.5
Person 71.2
Person 69
Person 68
Person 67.6
Person 67.1
Person 66.9
Person 65.1
Leisure Activities 62.8
Portrait 60.8
Face 60.8
Photography 60.8
Photo 60.8
Floor 59.5
Dance Pose 56.3

Clarifai
created on 2023-10-29

people 99.8
woman 99.4
group 98.8
dress 97.2
adult 97
many 95.5
dancing 95.5
man 94.1
music 93.1
group together 92.4
girl 91.7
fashion 91.2
wear 90.7
model 90.3
child 89
education 88.9
bridesmaid 87.7
school 87.2
dancer 85.6
crowd 84.9

Imagga
created on 2022-03-04

brass 82.5
wind instrument 66.1
musical instrument 44.1
people 34
group 29.8
men 28.3
man 26.9
business 26.1
trombone 23.6
performer 23.6
male 22
cornet 21.9
person 20.6
women 20.5
businessman 20.3
adult 20.2
bugle 20
musician 19.6
outfit 18.7
singer 18.1
silhouette 17.4
entertainer 17.3
suit 17.1
team 17
black 15.6
corporate 15.5
happy 14.4
professional 14.3
urban 14
success 13.7
crowd 13.4
dancer 13.1
fashion 12.8
human 12.7
couple 12.2
teamwork 12
attractive 11.9
work 11.8
city 11.6
friends 10.3
executive 10.1
smile 10
music 9.8
fun 9.7
standing 9.6
party 9.4
smiling 9.4
happiness 9.4
youth 9.4
baritone 9.1
handsome 8.9
office 8.8
clothing 8.8
building 8.7
lifestyle 8.7
scene 8.6
device 8.5
portrait 8.4
street 8.3
shopping 8.3
holding 8.2
life 8.2
style 8.2
cute 7.9
pretty 7.7
casual 7.6
joy 7.5
leisure 7.5
teen 7.3
teenager 7.3
confident 7.3
dress 7.2
cool 7.1
job 7.1
travel 7
indoors 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

person 98.9
clothing 95.7
woman 90.3
dress 82.5
footwear 70.7
text 68.5
people 66.3
group 64.2
dance 61
crowd 26.5

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 94.4%
Calm 99.3%
Sad 0.4%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 99.1%
Calm 62.9%
Happy 22.7%
Sad 6.8%
Disgusted 4.5%
Angry 1%
Fear 0.7%
Surprised 0.7%
Confused 0.7%

AWS Rekognition

Age 34-42
Gender Male, 53.4%
Calm 89.8%
Sad 6.1%
Confused 1.8%
Happy 1%
Surprised 0.4%
Disgusted 0.4%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 22-30
Gender Male, 87.2%
Sad 68.3%
Calm 25.8%
Disgusted 1.7%
Angry 1.5%
Fear 1%
Confused 0.6%
Happy 0.6%
Surprised 0.4%

AWS Rekognition

Age 22-30
Gender Female, 75.4%
Calm 97%
Sad 1.5%
Confused 0.5%
Happy 0.4%
Disgusted 0.3%
Surprised 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 10-18
Gender Female, 75.5%
Sad 82.7%
Confused 7%
Calm 3.5%
Angry 2.1%
Fear 2%
Happy 0.9%
Disgusted 0.9%
Surprised 0.9%

AWS Rekognition

Age 22-30
Gender Male, 96.5%
Calm 93.2%
Sad 4.7%
Confused 0.8%
Disgusted 0.4%
Surprised 0.4%
Angry 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 91.5%
Calm 78.5%
Sad 4.5%
Surprised 4.2%
Fear 3.5%
Happy 3.1%
Angry 2.3%
Disgusted 2.1%
Confused 1.8%

AWS Rekognition

Age 37-45
Gender Male, 75.5%
Calm 94.8%
Happy 1.6%
Surprised 1%
Fear 0.8%
Disgusted 0.6%
Confused 0.4%
Sad 0.4%
Angry 0.3%

AWS Rekognition

Age 31-41
Gender Male, 93.9%
Calm 65.6%
Sad 17%
Confused 7.4%
Disgusted 3.4%
Fear 2.5%
Happy 1.8%
Angry 1.3%
Surprised 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99%
Person 98.5%
Person 98.2%
Person 97.2%
Person 97.1%
Person 96.9%
Person 96.5%
Person 96.2%
Person 96%
Person 94.2%
Person 93.4%
Person 92.3%
Person 91.4%
Person 90.1%
Person 83.3%
Person 76.9%
Person 76.6%
Person 72.5%
Person 71.2%
Person 69%
Person 68%
Person 67.6%
Person 67.1%
Person 66.9%
Person 65.1%

Categories

Text analysis

Amazon

Je
VAGOY