Human Generated Data

Title

Untitled (businessmen around conference table)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20135

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.3
Person 99.3
Person 99.2
Person 99
Person 98.9
Person 98.5
Person 97.3
Person 96.7
Person 96.2
Tie 91
Accessory 91
Accessories 91
Clinic 84.8
Doctor 74.9
Tie 73
Nurse 59.1
Face 57.9
Hospital 57
Portrait 56.2
Photography 56.2
Photo 56.2

Imagga
created on 2022-03-05

lab coat 100
coat 82.9
man 51.1
specialist 42.7
male 40.5
garment 37
people 31.8
person 31.1
businessman 30.9
adult 30.9
nurse 30.3
business 28.6
colleagues 28.2
office 27.3
indoors 26.4
clothing 25.8
desk 24.6
meeting 23.6
patient 23.2
smiling 23.2
doctor 22.6
team 22.4
working 22.1
happy 22
professional 21.6
sitting 21.5
businesswoman 20.9
businesspeople 20.9
table 19.9
teamwork 19.5
group 19.4
indoor 18.3
hospital 18.1
men 18
casual 17.8
medical 17.7
room 17.6
lifestyle 16.6
education 16.5
color 16.1
mature 15.8
associates 15.8
coworkers 15.7
work 15.7
practitioner 15.6
30s 15.4
looking 15.2
job 15.1
portrait 14.9
40s 14.6
corporate 14.6
laptop 14.6
mid adult 14.5
home 14.4
women 14.2
senior 14.1
health 13.9
camera 13.9
computer 13.6
worker 13.4
teacher 12.9
20s 12.8
day 12.6
student 12.4
talking 12.4
together 12.3
couple 12.2
clinic 12
occupation 11.9
happiness 11.8
suit 11.7
discussion 11.7
four 11.5
medicine 11.5
smile 11.4
cheerful 11.4
standing 11.3
case 11.1
holding 10.7
bright 10.7
face 10.7
executive 10.5
elderly 10.5
ethnic 10.5
sick person 10.4
horizontal 10.1
boardroom 9.9
attractive 9.8
thirties 9.7
successful 9.2
classroom 9.2
modern 9.1
care 9.1
business people 8.9
collaboration 8.9
explaining 8.9
to 8.9
30-35 years 8.9
discussing 8.8
businessmen 8.8
expertise 8.7
cooperation 8.7
concentration 8.7
busy 8.7
adults 8.5
manager 8.4
technology 8.2
school 8.1
handsome 8
four people 7.9
multi ethnic group 7.9
doctors 7.9
forties 7.9
good mood 7.8
50s 7.8
two people 7.8
employee 7.7
diversity 7.7
twenties 7.7
two 7.6
workplace 7.6
smart 7.5
coffee 7.4
focus 7.4
inside 7.4
confident 7.3
success 7.2
board 7.2
interior 7.1
surgeon 7

Google
created on 2022-03-05

White 92.2
Black 89.6
Table 86
Black-and-white 82.6
Monochrome 73.1
Service 72.7
Monochrome photography 72.4
Event 70.2
Crew 70.2
Room 67.7
Hat 67.5
Team 66
Vintage clothing 65.4
Sitting 64
T-shirt 62.2
History 61.8
Uniform 61
Cooking 60.7
Health care 59.4
Job 59.1

Microsoft
created on 2022-03-05

person 99.5
clothing 89.2
text 87.4
man 83.6
line 17.8

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Calm 93.6%
Sad 3.7%
Confused 0.8%
Angry 0.6%
Surprised 0.4%
Disgusted 0.4%
Happy 0.4%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Calm 87.7%
Confused 9.1%
Surprised 1.6%
Happy 0.9%
Disgusted 0.2%
Sad 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Male, 96.7%
Calm 99.8%
Surprised 0.1%
Happy 0%
Disgusted 0%
Sad 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 50-58
Gender Male, 86.3%
Calm 100%
Sad 0%
Happy 0%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 99.4%
Calm 81.2%
Sad 11.4%
Confused 5.5%
Happy 1%
Disgusted 0.4%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Female, 50.9%
Happy 41.6%
Sad 31%
Calm 7.7%
Confused 6.9%
Disgusted 6.1%
Angry 4.2%
Surprised 1.5%
Fear 1%

AWS Rekognition

Age 34-42
Gender Female, 86.8%
Calm 95.5%
Happy 2.5%
Surprised 0.6%
Sad 0.3%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Tie 91%

Captions

Microsoft

a group of people in a room 94.5%
a group of people around each other 85.3%
a group of people standing in a room 85.2%

Text analysis

Amazon

sap
KODAK-SVELIA
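
The tag lists above pair each label with a confidence score (0-100). As a minimal sketch of how such lists might be post-processed — a hypothetical helper, not part of the record or of any vendor API — one could keep only labels at or above a chosen threshold:

```python
def filter_tags(tags, threshold=90.0):
    """Keep only (label, confidence) pairs at or above the threshold."""
    return [(label, conf) for label, conf in tags if conf >= threshold]

# A few entries transcribed from the Amazon tag list above.
amazon_tags = [
    ("Human", 99.3), ("Person", 99.3), ("Tie", 91.0),
    ("Clinic", 84.8), ("Doctor", 74.9), ("Nurse", 59.1),
]

high_confidence = filter_tags(amazon_tags, threshold=90.0)
# → [("Human", 99.3), ("Person", 99.3), ("Tie", 91.0)]
```

With a 90-point cutoff, speculative labels such as "Doctor" and "Nurse" (which the human metadata contradicts — the photograph shows businessmen) are dropped, while the high-confidence "Human", "Person", and "Tie" tags survive.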