Human Generated Data

Title

Untitled (couples dancing at debutante ball)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19285

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.2
Human 99.2
Person 98.9
Person 98.8
Person 98.7
Person 98
Person 97.5
Person 97.3
Person 96.5
Dance Pose 96.3
Leisure Activities 96.3
Person 95.8
Person 93.5
Person 87.4
Shoe 85.4
Footwear 85.4
Clothing 85.4
Apparel 85.4
Stage 84.7
Person 83.3
Person 83
Person 76.5
Suit 74.1
Overcoat 74.1
Coat 74.1
Poster 67.9
Advertisement 67.9
Evening Dress 66.2
Fashion 66.2
Gown 66.2
Robe 66.2
Shoe 66
Text 62.6
People 60.9
Dance 57.7
Tuxedo 57.4
Wood 57
Portrait 56.2
Photography 56.2
Face 56.2
Photo 56.2

Clarifai
created on 2023-10-22

people 99.5
man 96.9
group 95
woman 94.9
adult 93.8
business 90.5
education 90.3
meeting 90.1
indoors 89.4
actor 85.3
office 84.7
group together 83.3
room 82.7
teamwork 80.5
partnership 80.1
presentation 78.7
leader 78
portrait 77.9
squad 70.2
success 69.6

Imagga
created on 2022-02-25

people 37.9
silhouette 36.4
business 35.2
businessman 30
group 29.8
men 27.5
man 26.9
team 26.9
male 25.5
person 21.7
corporate 21.5
work 21.2
professional 20.8
adult 20.5
brass 19.2
crowd 19.2
success 18.5
teacher 18.2
women 18.2
meeting 17.9
wind instrument 16.7
suit 16.2
hall 16
black 15.9
cornet 15.1
teamwork 14.8
office 14.6
manager 14
communication 13.4
educator 13.4
company 13
musical instrument 12.9
silhouettes 12.6
boss 12.4
job 12.4
businesswoman 11.8
life 11.6
career 11.4
world 11.4
human 11.2
sunset 9.9
employee 9.7
design 9.6
happy 9.4
presentation 9.3
executive 9.2
city 9.1
modern 9.1
shadow 9
photographer 8.9
businessmen 8.8
colleagues 8.7
couple 8.7
standing 8.7
leadership 8.6
walk 8.6
secretary 8.3
occupation 8.2
window 8.2
building 8
corporation 7.7
youth 7.7
finance 7.6
walking 7.6
outfit 7.6
fashion 7.5
room 7.5
light 7.4
graphic 7.3
girls 7.3
global 7.3
worker 7.2
interior 7.1
together 7
trainer 7

Google
created on 2022-02-25

Coat 89.8
Gesture 85.3
Font 81.9
Art 81
Suit 80.3
Formal wear 74
Event 73.2
Monochrome photography 67.5
Stock photography 63.5
Monochrome 62.3
Team 61.6
Room 60.7
History 60.5
Photo caption 59.9
Visual arts 59.5
Crew 54.9
Illustration 54.9
Rectangle 50.4

Microsoft
created on 2022-02-25

person 97.2
text 96.1
clothing 93.4
man 89.3
dance 80.9
standing 80
suit 64.1

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 88.6%
Calm 83.8%
Surprised 4.8%
Sad 3%
Disgusted 2.1%
Angry 2.1%
Confused 1.8%
Fear 1.7%
Happy 0.7%

AWS Rekognition

Age 54-62
Gender Female, 70.4%
Calm 95.9%
Sad 4%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%
Surprised 0%
Confused 0%

AWS Rekognition

Age 23-31
Gender Male, 98.7%
Happy 71.3%
Calm 20%
Sad 2.5%
Surprised 2.4%
Fear 1.5%
Confused 1%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 25-35
Gender Male, 95.2%
Disgusted 81.7%
Confused 10.4%
Sad 3%
Happy 1.8%
Surprised 1.4%
Fear 1.1%
Angry 0.5%
Calm 0.1%

AWS Rekognition

Age 24-34
Gender Male, 72.4%
Disgusted 34.5%
Happy 27%
Fear 12.5%
Surprised 7.9%
Calm 6.7%
Confused 4.3%
Sad 3.7%
Angry 3.4%

AWS Rekognition

Age 48-56
Gender Male, 94.1%
Calm 47.4%
Sad 34.8%
Disgusted 8.7%
Fear 3.1%
Confused 3%
Happy 1.1%
Surprised 0.9%
Angry 0.9%

AWS Rekognition

Age 13-21
Gender Male, 99.2%
Sad 45.4%
Angry 18.9%
Calm 15.9%
Fear 6.7%
Disgusted 6.3%
Confused 4.3%
Surprised 1.5%
Happy 1%

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Disgusted 30.2%
Surprised 23.5%
Calm 13.5%
Confused 11.8%
Sad 8.3%
Angry 6.4%
Happy 4.1%
Fear 2.2%

AWS Rekognition

Age 19-27
Gender Male, 87.3%
Sad 55.9%
Calm 39.4%
Confused 1.3%
Happy 1%
Disgusted 1%
Surprised 0.9%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 9-17
Gender Female, 95.1%
Calm 39.1%
Sad 26.9%
Disgusted 20.5%
Happy 5.3%
Surprised 3.7%
Angry 1.7%
Fear 1.5%
Confused 1.2%

AWS Rekognition

Age 29-39
Gender Male, 85.1%
Sad 86.2%
Calm 3.7%
Disgusted 3.6%
Confused 2.9%
Angry 1.5%
Fear 1.2%
Surprised 0.5%
Happy 0.5%

AWS Rekognition

Age 6-16
Gender Male, 59.5%
Calm 61.2%
Sad 25.6%
Confused 3.4%
Happy 3.4%
Fear 3.2%
Disgusted 1.7%
Angry 0.9%
Surprised 0.7%

AWS Rekognition

Age 23-31
Gender Female, 98.8%
Calm 62.7%
Sad 22%
Surprised 9.5%
Happy 2.6%
Confused 1.6%
Disgusted 0.8%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 16-22
Gender Male, 99.3%
Calm 40.9%
Sad 36.7%
Confused 9%
Surprised 5.2%
Disgusted 3.8%
Angry 2.5%
Happy 1%
Fear 0.9%

AWS Rekognition

Age 18-26
Gender Female, 55.4%
Sad 68.6%
Fear 14.2%
Calm 11.6%
Angry 2.6%
Disgusted 1.3%
Happy 1.3%
Surprised 0.3%
Confused 0.2%

AWS Rekognition

Age 13-21
Gender Male, 66.2%
Calm 98.6%
Fear 1%
Sad 0.1%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.2%
Person 98.9%
Person 98.8%
Person 98.7%
Person 98%
Person 97.5%
Person 97.3%
Person 96.5%
Person 95.8%
Person 93.5%
Person 87.4%
Person 83.3%
Person 83%
Person 76.5%
Shoe 85.4%
Shoe 66%
Poster 67.9%

Text analysis

Amazon

JAN
65
EXIT
REI
801

Google

108 137
108
137