Human Generated Data

Title

Untitled (group of men dressed as women)

Date

March 15, 1953

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18068

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Clothing 99.6
Apparel 99.6
Person 99.2
Human 99.2
Person 98.8
Person 98
Person 97
Person 96.4
Person 95.1
Person 94.8
Person 90.2
Person 89.9
Person 87.6
Person 87.1
Person 81.5
Bed 81
Furniture 81
Female 77.9
Hat 76.3
Face 66.6
Overcoat 64
Coat 64
People 62.2
Girl 62.1
Helmet 61.8
Shorts 59.5
Costume 59.2
Leisure Activities 57.1
Woman 56.5
Crowd 55.1

Clarifai
created on 2023-10-29

people 99.9
group 99.1
adult 98.4
woman 98.3
group together 97.1
monochrome 97
man 96.7
many 94.7
wear 93.6
furniture 92.6
child 90.1
administration 89.1
several 89.1
music 87.1
recreation 83.7
actress 83.4
outfit 82.2
room 81.4
education 81.3
indoors 81.2

Imagga
created on 2022-03-04

business 38.9
man 35.6
businessman 34.4
office 34.1
people 29
person 27.6
professional 27.6
brass 27.5
laptop 27.1
businesswoman 26.4
computer 25.9
adult 25.7
work 25.1
corporate 24.9
wind instrument 24.6
male 24.1
job 21.2
businesspeople 19.9
team 19.7
musical instrument 19.7
group 18.5
cornet 18.5
looking 18.4
happy 18.2
sitting 18
executive 17.7
career 17
worker 17
desk 17
communication 16.8
working 16.8
indoors 16.7
teamwork 16.7
indoor 16.4
smile 16.4
meeting 16
room 15.3
blackboard 15.1
men 14.6
modern 14
smiling 13.7
employee 13.6
women 13.4
attractive 13.3
teacher 13
suit 12.7
success 12.1
confident 11.8
handsome 11.6
lifestyle 11.6
boss 11.5
tie 11.4
chair 11.4
couple 11.3
successful 11
black 10.7
technology 10.4
friendly 10.1
center 9.9
fashion 9.8
pretty 9.8
interior 9.7
colleagues 9.7
workers 9.7
table 9.5
student 9.4
portrait 9.1
home 8.8
together 8.8
singer 8.7
corporation 8.7
formal 8.6
casual 8.5
style 8.2
photographer 8
receptionist 7.9
coworkers 7.9
education 7.8
support 7.8
assistant 7.8
classroom 7.7
leader 7.7
two 7.6
workplace 7.6
ethnic 7.6
educator 7.5
study 7.5
silhouette 7.4
holding 7.4
glasses 7.4
cheerful 7.3
alone 7.3
lady 7.3
musician 7.3
trombone 7.3
building 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

clothing 97.2
person 97
text 92.6
indoor 89.4
man 85.1
funeral 78.1
black and white 76.2

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 86%
Surprised 5.7%
Happy 2.6%
Fear 2%
Disgusted 1.4%
Angry 0.8%
Sad 0.8%
Confused 0.7%

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 47.9%
Surprised 39%
Confused 6.8%
Disgusted 2.6%
Sad 2%
Fear 0.7%
Angry 0.5%
Happy 0.5%

AWS Rekognition

Age 33-41
Gender Male, 99.4%
Calm 45.4%
Sad 20.5%
Surprised 19.8%
Happy 6.4%
Confused 4.4%
Angry 1.3%
Disgusted 1.2%
Fear 1%

AWS Rekognition

Age 39-47
Gender Male, 99.5%
Sad 74.9%
Confused 10.6%
Calm 7.1%
Disgusted 1.8%
Angry 1.7%
Fear 1.6%
Surprised 1.3%
Happy 0.9%

AWS Rekognition

Age 25-35
Gender Male, 66.7%
Calm 99.4%
Happy 0.2%
Sad 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 38-46
Gender Male, 94.7%
Calm 84.2%
Fear 7.6%
Happy 2.1%
Surprised 1.7%
Confused 1.6%
Sad 1.5%
Disgusted 0.9%
Angry 0.5%

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Sad 50.8%
Calm 21%
Happy 16.7%
Confused 3.5%
Surprised 2.7%
Fear 2.3%
Disgusted 1.8%
Angry 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Person 98.8%
Person 98%
Person 97%
Person 96.4%
Person 95.1%
Person 94.8%
Person 90.2%
Person 89.9%
Person 87.6%
Person 87.1%
Person 81.5%
Bed 81%
Hat 76.3%
Helmet 61.8%

Text analysis

Amazon

JOO
KAOOK