Human Generated Data

Title

Untitled (group of men at bar)

Date

1954

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1565

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Person 99.5
Person 99.2
Person 98.8
Person 97.1
Person 95.8
Clothing 94.3
Apparel 94.3
Clinic 93
Person 85.2
People 76.2
Person 75.1
Indoors 68.3
Hospital 67.2
Person 64.3
Portrait 61.1
Photography 61.1
Face 61.1
Photo 61.1
Operating Theatre 59.2
Doctor 59
Person 56.1
Room 55.2
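Each tag above pairs a label with a detection confidence, and repeated labels (the many "Person" entries) correspond to distinct detected instances. A minimal sketch of how such a list is typically consumed — filtering by a confidence threshold and counting instances. The data is transcribed from the Amazon tags above (truncated for brevity); the 90.0 threshold is an arbitrary example, not part of the record:

```python
# Label/confidence pairs transcribed from the Amazon Rekognition tags above
# (truncated); each repeated "Person" is a separate detected instance.
labels = [
    ("Person", 99.7), ("Human", 99.7), ("Person", 99.5), ("Person", 99.2),
    ("Person", 98.8), ("Person", 97.1), ("Person", 95.8), ("Clothing", 94.3),
    ("Apparel", 94.3), ("Clinic", 93.0), ("Person", 85.2), ("People", 76.2),
    ("Person", 75.1), ("Indoors", 68.3), ("Hospital", 67.2), ("Person", 64.3),
]

def filter_labels(pairs, threshold):
    """Keep only (name, confidence) pairs at or above the threshold."""
    return [(name, conf) for name, conf in pairs if conf >= threshold]

# Example: keep only high-confidence labels, and count person instances.
high_conf = filter_labels(labels, 90.0)
person_count = sum(1 for name, _ in labels if name == "Person")
```

With the transcribed subset, a 90.0 cutoff keeps ten labels, and nine of the sixteen entries are "Person" instances.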

Clarifai
created on 2023-10-15

people 99.8
group 99.4
man 98.1
woman 96.7
adult 96.4
many 95.5
group together 95.4
indoors 89.7
monochrome 89.2
leader 86.4
several 86.1
sit 83
wedding 83
administration 78.7
military 77.1
child 72.5
five 68.1
recreation 65.9
furniture 65.1
education 64.2

Imagga
created on 2021-12-14

man 39.7
male 31.9
people 31.8
adult 28.3
person 27.4
office 24.3
business 23.1
businessman 22
professional 21.3
nurse 20.9
meeting 20.7
happy 20.7
smiling 19.5
room 19.1
work 18.8
working 18.5
brass 18.4
team 17.9
casual 17.8
indoors 17.6
women 17.4
lifestyle 16.6
men 16.3
desk 16.3
businesspeople 16.1
home 16.1
computer 16
table 15.7
smile 15.7
laptop 15.5
businesswoman 15.4
wind instrument 15.3
cheerful 14.6
corporate 14.6
portrait 13.6
sitting 12.9
looking 12.8
colleagues 12.6
communication 12.6
job 12.4
interior 12.4
medical 12.3
talking 12.3
couple 12.2
manager 12.1
group 12.1
modern 11.9
occupation 11.9
worker 11.9
musical instrument 11.3
mature 11.1
teamwork 11.1
executive 11
40s 10.7
education 10.4
patient 10.3
happiness 10.2
camera 10.2
indoor 10
house 10
face 9.9
casual clothing 9.8
two people 9.7
specialist 9.6
30s 9.6
doctor 9.4
two 9.3
student 9.2
horizontal 9.2
20s 9.2
confident 9.1
holding 9.1
handsome 8.9
classroom 8.8
discussion 8.8
mid adult 8.7
day 8.6
cornet 8.5
togetherness 8.5
attractive 8.4
hand 8.3
color 8.3
health 8.3
human 8.2
technology 8.2
suit 8.1
glass 8
to 8
medicine 7.9
standing 7.8
staff 7.8
clothing 7.7
bed 7.6
career 7.6
senior 7.5
one 7.5
device 7.4
alone 7.3
life 7.3
bright 7.1
together 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 95.6
person 94.8
indoor 91.9
drawing 91.3
text 90.6
man 87.7
clothing 87.3
sketch 77.1

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 56.8%
Calm 64.1%
Sad 19.3%
Happy 9.5%
Angry 4.1%
Fear 1%
Confused 0.9%
Surprised 0.9%
Disgusted 0.3%

AWS Rekognition

Age 25-39
Gender Female, 64.9%
Surprised 35.6%
Disgusted 30.1%
Angry 13.6%
Sad 9.4%
Calm 5.4%
Confused 3.4%
Happy 1.4%
Fear 1.1%

AWS Rekognition

Age 23-37
Gender Male, 86%
Sad 41.6%
Happy 28.8%
Calm 13.8%
Confused 6.4%
Surprised 4.6%
Angry 2.4%
Fear 1.6%
Disgusted 0.8%

AWS Rekognition

Age 20-32
Gender Male, 98.7%
Calm 59.1%
Sad 39%
Happy 0.8%
Angry 0.5%
Surprised 0.3%
Confused 0.3%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 24-38
Gender Male, 93.2%
Calm 60.6%
Sad 35.8%
Happy 1.5%
Confused 1.1%
Angry 0.6%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 33-49
Gender Male, 67.8%
Sad 76.5%
Calm 15.3%
Surprised 2.9%
Confused 1.8%
Happy 1.3%
Angry 1%
Fear 0.9%
Disgusted 0.2%

AWS Rekognition

Age 47-65
Gender Female, 70.4%
Calm 78.7%
Sad 20.3%
Happy 0.7%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 22-34
Gender Male, 74.3%
Sad 47.7%
Calm 22%
Confused 17.9%
Disgusted 5.3%
Surprised 3%
Happy 2.4%
Angry 1%
Fear 0.8%

AWS Rekognition

Age 18-30
Gender Male, 90.7%
Calm 95.9%
Happy 3.6%
Sad 0.2%
Angry 0.2%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%
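Each AWS Rekognition face block above is effectively a probability distribution over eight emotions, summing to roughly 100% (up to rounding). Reading off the dominant emotion is an argmax over that distribution. A minimal sketch, with numbers transcribed from the first face block above:

```python
# Emotion distribution transcribed from the first AWS Rekognition
# face block above (Age 17-29, Gender Female).
face = {
    "Calm": 64.1, "Sad": 19.3, "Happy": 9.5, "Angry": 4.1,
    "Fear": 1.0, "Confused": 0.9, "Surprised": 0.9, "Disgusted": 0.3,
}

# The dominant emotion is the key with the highest score: "Calm".
dominant = max(face, key=face.get)

# The scores form a distribution; rounding leaves the sum near 100.
total = sum(face.values())
```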

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
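Unlike Rekognition's percentages, Google Vision reports bucketed likelihoods for each face attribute. A common way to compare or filter them is an ordinal mapping; the sketch below mirrors the ordering of the API's likelihood buckets and uses the fifth face block above (the one with Joy rated "Unlikely") as example data:

```python
# Ordinal mapping of Google Vision's likelihood buckets, lowest to highest.
LIKELIHOOD = {
    "Very unlikely": 1, "Unlikely": 2, "Possible": 3,
    "Likely": 4, "Very likely": 5,
}

# Attribute ratings transcribed from the fifth Google Vision block above.
face = {
    "Surprise": "Very unlikely", "Anger": "Very unlikely",
    "Sorrow": "Very unlikely", "Joy": "Unlikely",
    "Headwear": "Very unlikely", "Blurred": "Very unlikely",
}

# Flag any attribute rated "Possible" or higher -- none for this face.
flagged = [k for k, v in face.items()
           if LIKELIHOOD[v] >= LIKELIHOOD["Possible"]]
```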

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

.PT
KODAK-VEEA--EITW