Human Generated Data

Title

Untitled (couples dancing to live music in large hall at Gulf Oil Corp. picnic, Luling, Texas)

Date

1951

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3005

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Clothing 99.7
Apparel 99.7
Person 99.6
Human 99.6
Person 99.4
Person 99.4
Person 97.8
Person 97.6
Suit 96.4
Overcoat 96.4
Coat 96.4
Person 96.3
Dress 93.5
Person 88.9
Person 87.9
Wedding 87
Robe 86.8
Fashion 86.8
Person 85.9
Gown 84.9
Dance Pose 79.8
Leisure Activities 79.8
Female 79.7
Person 78
Bridegroom 77
Tuxedo 76.5
Person 75.8
Wedding Gown 74.5
Crowd 73.9
Person 71.2
People 70
Person 68.7
Party 65.5
Chair 64.5
Furniture 64.5
Woman 62.9
Portrait 60.1
Face 60.1
Photography 60.1
Photo 60.1
Dance 59.6
Bride 59.5
Person 58.7
Man 57.8
Person 57.6
Floor 55.9
Person 52.2
Person 50.3

Clarifai
created on 2023-10-26

people 99.9
group 99.3
many 98.9
group together 98.6
woman 98
man 97.5
adult 95
education 94.7
child 93.7
school 93.7
crowd 88.4
indoors 87.8
room 87.2
several 87
five 85.8
recreation 83.2
adolescent 80.7
elementary school 80.6
monochrome 79.5
dancing 79.4

Imagga
created on 2022-01-21

photographer 48.7
brass 30.4
people 30.1
wind instrument 24.4
man 23.5
business 21.9
men 20.6
urban 19.2
group 18.5
city 18.3
cornet 17.9
male 17.7
person 17.1
musical instrument 17
women 16.6
crowd 16.3
black 15.8
adult 15.7
silhouette 14.9
travel 14.1
life 12.5
walking 12.3
suit 11.8
team 11.6
businessman 11.5
clothing 11.5
professional 11.4
office 11.3
spectator 11
street 11
trombone 10.9
world 10.8
corporate 10.3
work 10.2
occupation 10.1
station 9.7
success 9.6
meeting 9.4
teamwork 9.3
power 9.2
pedestrian 9.1
industrial 9.1
human 9
transportation 9
activity 8.9
job 8.8
working 8.8
scene 8.6
window 8.4
mask 8.3
active 8.3
danger 8.2
happy 8.1
building 7.7
party 7.7
train 7.7
move 7.7
walk 7.6
lifestyles 7.6
fashion 7.5
art 7.4
manager 7.4
tourist 7.4
protection 7.3
dirty 7.2
portrait 7.1
happiness 7

Microsoft
created on 2022-01-21

outdoor 97.8
person 94.9
clothing 94.7
footwear 88.4
man 84.9
black and white 84.5
basketball 73.5
woman 72.9
dance 69.6
people 69

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 91.9%
Calm 93.5%
Happy 3.2%
Fear 0.9%
Confused 0.6%
Sad 0.5%
Angry 0.5%
Surprised 0.5%
Disgusted 0.3%

AWS Rekognition

Age 26-36
Gender Female, 64.9%
Calm 77.9%
Sad 10.9%
Happy 5.7%
Confused 1.8%
Angry 1.4%
Surprised 1%
Fear 0.9%
Disgusted 0.4%

AWS Rekognition

Age 28-38
Gender Female, 55.3%
Sad 92%
Angry 2.1%
Disgusted 1.5%
Calm 1.5%
Confused 1.2%
Happy 0.8%
Fear 0.5%
Surprised 0.4%

AWS Rekognition

Age 34-42
Gender Male, 78%
Calm 87.4%
Sad 5.1%
Angry 2%
Fear 1.2%
Happy 1.2%
Confused 1.1%
Disgusted 1.1%
Surprised 0.8%

AWS Rekognition

Age 12-20
Gender Female, 79.1%
Calm 92.8%
Happy 2.7%
Sad 2%
Confused 0.8%
Angry 0.6%
Disgusted 0.6%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 18-24
Gender Female, 95.2%
Calm 87.9%
Sad 5.4%
Happy 1.9%
Angry 1.7%
Surprised 1%
Confused 0.8%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 27-37
Gender Female, 88.3%
Calm 56.4%
Happy 18.7%
Sad 17.9%
Confused 4.1%
Angry 0.9%
Disgusted 0.8%
Surprised 0.7%
Fear 0.5%

AWS Rekognition

Age 39-47
Gender Male, 90.6%
Calm 74.5%
Sad 19.9%
Angry 1.5%
Happy 1%
Confused 1%
Fear 0.8%
Surprised 0.7%
Disgusted 0.6%

AWS Rekognition

Age 35-43
Gender Male, 96.2%
Calm 94.6%
Confused 1.7%
Disgusted 1.4%
Sad 0.6%
Happy 0.6%
Angry 0.6%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 19-27
Gender Male, 91.2%
Calm 63.6%
Fear 13.9%
Sad 12.2%
Confused 4.4%
Happy 2%
Disgusted 1.7%
Surprised 1.3%
Angry 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

7
KODAK
SAFETY