Human Generated Data

Title

Untitled (young couples dancing under tent with other couples watching)

Date

1935-1945

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9055

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.9
Apparel 99.9
Dress 99.6
Person 99.5
Human 99.5
Person 99.1
Person 98.9
Shoe 98
Footwear 98
Person 97.8
Female 97.6
Person 97.2
Person 95.9
Person 95.9
Person 93.5
Dance Pose 93.3
Leisure Activities 93.3
Person 92.9
Shorts 92.8
Suit 92
Overcoat 92
Coat 92
Person 91.5
Person 91.3
Person 89.1
Woman 88.6
Person 87.1
Person 86.5
Hand 84.7
Horse 80.9
Animal 80.9
Mammal 80.9
Floor 80.3
Stage 76.7
Girl 72.8
Face 72.8
Person 71.8
People 71.6
Person 70.2
Portrait 69.6
Photography 69.6
Photo 69.6
Crowd 68.4
Outdoors 66.5
Robe 64.9
Fashion 64.9
Gown 64.8
Chair 64.3
Furniture 64.3
Person 63.3
Bridegroom 62.8
Wedding 62.8
Skirt 61.8
Man 60.7
Flooring 59.1
Party 58.4
Pants 58.1
Dance 57.7
Indoors 56.1
Holding Hands 56
Wedding Gown 55.4

Clarifai
created on 2023-10-26

people 99.9
dancing 99.3
man 98.6
dancer 98.3
group together 98.2
many 97.2
woman 96.7
group 96.3
music 95
recreation 94.9
adult 93.6
crowd 93
child 92.3
wear 91.5
spectator 89.9
motion 88.7
enjoyment 83.3
teacher 81.1
education 79.8
street 78.9

Imagga
created on 2022-01-23

sword 75.8
weapon 60
dancer 51
performer 38.9
person 36.4
people 34.6
man 32.9
adult 29.3
entertainer 27.9
male 27
professional 26.9
business 25.5
men 24.9
corporate 24.1
businessman 23
women 21.4
group 21
team 20.6
dress 19
teacher 18.4
happy 17.5
suit 15.3
work 14.9
attractive 14.7
office 14.5
educator 14.3
building 14.3
black 13.9
bride 13.4
pretty 13.3
meeting 13.2
teamwork 13
executive 12.9
fashion 12.8
sport 12.6
job 12.4
together 12.3
couple 12.2
success 12.1
dance 12.1
groom 12
businesswoman 11.8
portrait 11.6
boss 11.5
hand 11.4
hands 11.3
wedding 11
smile 10.7
active 10.6
modern 10.5
formal 10.5
human 10.5
exercise 10
city 10
pose 10
silhouette 9.9
crowd 9.6
legs 9.4
happiness 9.4
smiling 9.4
lifestyle 9.4
communication 9.2
holding 9.1
urban 8.7
standing 8.7
dancing 8.7
day 8.6
sitting 8.6
motion 8.6
marriage 8.5
females 8.5
casual 8.5
two 8.5
elegance 8.4
action 8.3
successful 8.2
style 8.2
worker 8.1
posing 8
model 7.8
full length 7.8
outside 7.7
businesspeople 7.6
career 7.6
outdoors 7.5
manager 7.4
street 7.4
clothing 7.4
indoor 7.3
laptop 7.3
looking 7.2
body 7.2
hall 7.2
handsome 7.1
love 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.6
dance 97.7
clothing 85.7
black and white 83.8
people 79
sport 77.6
standing 75.3
man 74.4
woman 70.7
footwear 64
group 61.6
crowd 47.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Female, 86.9%
Sad 98.5%
Calm 0.5%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%
Happy 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 51-59
Gender Female, 92.1%
Calm 36.7%
Surprised 31%
Fear 14%
Sad 9.9%
Disgusted 2.7%
Angry 2.6%
Confused 2.4%
Happy 0.8%

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Calm 91.5%
Sad 1.9%
Surprised 1.9%
Confused 1.4%
Angry 1.2%
Disgusted 0.9%
Fear 0.8%
Happy 0.5%

AWS Rekognition

Age 29-39
Gender Female, 99.5%
Happy 82%
Calm 5.8%
Disgusted 3.5%
Sad 3.2%
Angry 2.9%
Confused 1.5%
Surprised 0.7%
Fear 0.4%

AWS Rekognition

Age 48-54
Gender Female, 96%
Sad 71.3%
Calm 13.5%
Confused 11.1%
Happy 1.3%
Disgusted 1%
Surprised 0.8%
Angry 0.6%
Fear 0.4%

AWS Rekognition

Age 43-51
Gender Male, 94.1%
Sad 97.3%
Confused 0.9%
Calm 0.9%
Disgusted 0.3%
Angry 0.2%
Happy 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 54-64
Gender Male, 99.5%
Calm 98%
Surprised 1.5%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%
Fear 0.1%
Confused 0.1%
Sad 0%

AWS Rekognition

Age 54-62
Gender Male, 80.6%
Calm 39.8%
Happy 30.8%
Surprised 11%
Sad 9.3%
Disgusted 3.1%
Fear 3%
Confused 1.8%
Angry 1.3%

AWS Rekognition

Age 6-16
Gender Female, 98.4%
Sad 78%
Calm 14.9%
Happy 2.7%
Fear 1.9%
Angry 0.8%
Disgusted 0.7%
Confused 0.6%
Surprised 0.4%

AWS Rekognition

Age 50-58
Gender Male, 92.7%
Sad 87.7%
Calm 11.3%
Confused 0.3%
Happy 0.3%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Male, 79.9%
Surprised 34.2%
Sad 33.9%
Calm 17.4%
Confused 6.8%
Fear 3.7%
Disgusted 1.9%
Happy 1.2%
Angry 1%

AWS Rekognition

Age 50-58
Gender Male, 94.5%
Calm 91.9%
Sad 6.1%
Confused 0.5%
Happy 0.5%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 94.7%
Sad 69.8%
Calm 13.7%
Confused 13.6%
Surprised 0.9%
Angry 0.6%
Disgusted 0.5%
Happy 0.5%
Fear 0.4%

AWS Rekognition

Age 20-28
Gender Female, 98.2%
Fear 85.3%
Sad 9.9%
Surprised 1.9%
Confused 0.7%
Angry 0.6%
Happy 0.6%
Calm 0.6%
Disgusted 0.4%

AWS Rekognition

Age 26-36
Gender Female, 64.4%
Sad 53.3%
Happy 20%
Calm 17.5%
Confused 5.4%
Angry 1.3%
Surprised 0.9%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 24-34
Gender Male, 83.6%
Sad 51.5%
Calm 24.5%
Confused 6.9%
Happy 5.8%
Disgusted 3.5%
Angry 3.2%
Fear 3.1%
Surprised 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Shoe 98%
Horse 80.9%

Text analysis

Amazon

8
2" 8 "
P
"
2"
and