Human Generated Data

Title

Tea Dance, Tegernsee

Date

1932 (printed 1980)

People

Artist: Alfred Eisenstaedt, German, 1898 - 1995

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Lufthansa German Airlines, 1989.69.5

Copyright

Photo by Alfred Eisenstaedt/The LIFE Picture Collection/Getty Images

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Dance Pose 99.8
Leisure Activities 99.8
Person 99.7
Human 99.7
Person 99.7
Person 99.4
Person 98
Dance 97.4
Tango 96.4
Shoe 95.5
Clothing 95.5
Footwear 95.5
Apparel 95.5
Person 93.2
Person 92.1
Chair 89.2
Furniture 89.2
Person 86.4
Person 85.3
Person 80.5
Person 77.2
People 66.7
Shoe 65.6
Meal 56.8
Food 56.8
Musician 55.2
Musical Instrument 55.2

Clarifai
created on 2023-10-26

people 100
many 99.5
child 99.5
group 98.8
woman 98.7
man 98.6
group together 98.3
music 98.1
adult 98
several 97.3
recreation 96.6
dancing 94.7
wear 94
administration 93.2
education 91.9
leader 90.2
musician 88.1
spectator 87.8
enjoyment 83.4
two 83.3

Imagga
created on 2022-01-23

brass 51.9
wind instrument 45.3
horn 43
cornet 40.8
musical instrument 38.4
device 36.4
man 28.2
instrumentality 25.4
male 22.7
silhouette 19
people 19
sport 18.6
artifact 16.9
person 16.9
adult 16.9
wheeled vehicle 15.5
outdoor 14.5
active 12.9
outdoors 12.8
black 12.6
leisure 12.4
lifestyle 12.3
play 12.1
unicycle 11.8
recreation 11.6
men 11.2
guitar 11.1
music 11
vehicle 10.9
sky 10.8
ball 10.2
stringed instrument 10
trombone 9.9
musician 9.9
sunset 9.9
equipment 9.7
player 9.6
boy 9.6
day 9.4
violin 9.4
portrait 9.1
lady 8.9
game 8.9
happy 8.8
couple 8.7
jumping 8.7
water 8.7
bowed stringed instrument 8.5
hand 8.4
park 8.3
entertainment 8.3
competition 8.2
skateboard 8.2
teenager 8.2
playing 8.2
whole 8.2
group 8.1
cool 8
conveyance 7.9
professional 7.8
outside 7.7
club 7.5
fun 7.5
one 7.5
landscape 7.4
holding 7.4
action 7.4
teen 7.3
business 7.3
success 7.2
looking 7.2
team 7.2
businessman 7.1
together 7

Google
created on 2022-01-23

Musical instrument 89.8
Coat 89.8
Black 89.6
Dress 89
Tree 87.3
Black-and-white 86.2
Chair 85.9
Musician 85.8
Style 84.2
Suit 82.1
Sky 81.3
Music 81.2
Cello 80.8
Monochrome 79.5
Violin family 79
Entertainment 78.5
Monochrome photography 78.1
Dance 77
Snapshot 74.3
Event 73.6

Microsoft
created on 2022-01-23

sky 99.8
outdoor 98.5
person 94.3
text 92.9
clothing 92.4
man 82
woman 79
people 73
black and white 66.2
group 57.5
footwear 57.1
tree 50.7
crowd 0.6

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 99.7%
Sad 92%
Confused 4.4%
Calm 1.4%
Angry 1.1%
Surprised 0.4%
Disgusted 0.3%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 23-31
Gender Male, 100%
Calm 55%
Sad 36.8%
Angry 2.9%
Confused 2.7%
Fear 0.8%
Happy 0.7%
Surprised 0.6%
Disgusted 0.5%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Sad 86.8%
Confused 6.8%
Disgusted 3.7%
Angry 1%
Fear 0.8%
Surprised 0.5%
Calm 0.2%
Happy 0.2%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Sad 81.6%
Calm 12.2%
Angry 2.3%
Surprised 1.4%
Confused 1%
Disgusted 0.6%
Fear 0.5%
Happy 0.3%

AWS Rekognition

Age 26-36
Gender Female, 99.9%
Calm 58%
Confused 18%
Angry 5.7%
Fear 5.4%
Disgusted 5.3%
Sad 2.9%
Surprised 2.5%
Happy 2.2%

AWS Rekognition

Age 28-38
Gender Male, 99.3%
Calm 96.1%
Sad 2.4%
Happy 0.5%
Disgusted 0.3%
Confused 0.3%
Surprised 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 22-30
Gender Female, 100%
Disgusted 28.6%
Calm 23.5%
Happy 19.5%
Sad 12.6%
Fear 7.8%
Confused 4.3%
Angry 2.1%
Surprised 1.6%

AWS Rekognition

Age 19-27
Gender Male, 60.3%
Sad 28%
Angry 26%
Calm 21.8%
Confused 8.1%
Disgusted 5.5%
Fear 5%
Happy 3.1%
Surprised 2.6%

AWS Rekognition

Age 6-14
Gender Female, 100%
Sad 97.8%
Calm 0.6%
Happy 0.5%
Confused 0.3%
Fear 0.3%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.7%
Shoe 95.5%
Chair 89.2%