Human Generated Data

Title

Untitled (men and woman next to long table)

Date

1956

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20277

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Tie 98.4
Accessory 98.4
Accessories 98.4
Human 98
Person 98
Clothing 96.2
Apparel 96.2
Person 94.6
Person 93.3
Person 78.6
Coat 77.7
Clinic 75.5
Suit 64.1
Overcoat 64.1
Sleeve 63.1
Person 62.7
Nurse 55.2
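
The Amazon tags above are label/confidence pairs, where the number is the model's confidence in percent. A minimal sketch of how such pairs could be filtered by a confidence threshold (the `raw_labels` list is transcribed from the tags above; the helper name is illustrative, not part of any API):

```python
# Label/confidence pairs transcribed from the Amazon tag list above.
raw_labels = [
    ("Tie", 98.4), ("Accessory", 98.4), ("Accessories", 98.4),
    ("Human", 98.0), ("Person", 98.0), ("Clothing", 96.2),
    ("Apparel", 96.2), ("Coat", 77.7), ("Clinic", 75.5),
    ("Suit", 64.1), ("Overcoat", 64.1), ("Sleeve", 63.1),
    ("Nurse", 55.2),
]

def confident_labels(labels, threshold=90.0):
    """Return the label names whose confidence meets the threshold."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(raw_labels))
# ['Tie', 'Accessory', 'Accessories', 'Human', 'Person', 'Clothing', 'Apparel']
```

Lower-confidence tags such as "Clinic" or "Nurse" drop out at this threshold, which is why they are plausible but unreliable descriptions of the photograph.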

Imagga
created on 2022-03-05

man 39.6
surgeon 37.9
people 35.1
person 31.4
male 31.2
office 28.3
work 23.5
computer 23.5
business 23.1
adult 22.9
laptop 22.9
senior 19.7
working 18.6
room 18.2
teacher 16.8
businessman 16.8
indoors 16.7
men 16.3
portrait 16.2
happy 15.7
education 15.6
modern 15.4
professional 15.4
patient 15
sitting 14.6
center 14.6
group 14.5
executive 14.4
worker 14.2
technology 14.1
smiling 13.7
case 13.7
doctor 13.2
student 13
hand 12.9
corporate 12.9
looking 12.8
home 12.8
human 12.7
desk 12.6
job 12.4
meeting 12.2
classroom 12
board 11.8
medical 11.5
mature 11.2
equipment 11
indoor 11
screen 10.8
team 10.7
table 10.6
class 10.6
teamwork 10.2
occupation 10.1
businesswoman 10
holding 9.9
health 9.7
medicine 9.7
hospital 9.6
couple 9.6
shop 9.2
monitor 8.9
60s 8.8
women 8.7
hands 8.7
lifestyle 8.7
studying 8.6
blackboard 8.5
college 8.5
casual 8.5
success 8
handsome 8
clinic 8
to 8
together 7.9
smile 7.8
face 7.8
black 7.8
teaching 7.8
coat 7.7
retirement 7.7
old 7.7
elderly 7.7
adults 7.6
communication 7.6
horizontal 7.5
one 7.5
care 7.4
camera 7.4
school 7.2
specialist 7.1
happiness 7
barbershop 7
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.4
person 97.6
man 95.6
wall 95.6
clothing 93.5
indoor 85.3
black and white 53.7

Face analysis

Amazon

AWS Rekognition

Age 41-49
Gender Male, 92.4%
Happy 51%
Calm 46.4%
Sad 0.7%
Fear 0.5%
Surprised 0.5%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 45-51
Gender Male, 99.7%
Calm 56.3%
Confused 25%
Sad 12.3%
Angry 2%
Disgusted 2%
Surprised 1.3%
Happy 0.9%
Fear 0.3%

AWS Rekognition

Age 47-53
Gender Male, 98.2%
Calm 97.1%
Sad 1.2%
Surprised 0.5%
Disgusted 0.3%
Happy 0.3%
Confused 0.3%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Male, 68.6%
Calm 88.1%
Surprised 8.2%
Happy 2.7%
Disgusted 0.4%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Sad 0.1%
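
Each AWS Rekognition face result above lists a confidence for every emotion, summing to roughly 100%. A minimal sketch of reducing one such result to its dominant emotion (values transcribed from the first face result; the helper is a hypothetical convenience, not a Rekognition API call):

```python
# Emotion percentages transcribed from the first AWS Rekognition face above.
face_emotions = {
    "Happy": 51.0, "Calm": 46.4, "Sad": 0.7, "Fear": 0.5,
    "Surprised": 0.5, "Angry": 0.3, "Disgusted": 0.3, "Confused": 0.2,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # ('Happy', 51.0)
```

Note that for this face "Happy" (51%) and "Calm" (46.4%) are nearly tied, so the dominant label alone understates the model's uncertainty.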

Feature analysis

Amazon

Tie 98.4%
Person 98%
Suit 64.1%

Captions

Microsoft

a man standing in front of a mirror posing for the camera 71.6%
a man standing in front of a mirror 71.5%
a man that is standing in front of a mirror posing for the camera 62.8%

Text analysis

Amazon

THE
THE JAZZ
JAZZ
NGERS
MESSE
R
EAT
CK
FORM
CO-
IL
ORG
سات
M117--YT37A°- -AX

Google

THE
NGER
THE JAZZ NGER CLL
CLL
JAZZ