Human Generated Data

Title

Untitled (man behind desk)

Date

1955

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20093

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Person 94.8
Home Decor 92.6
Food 84.6
Meal 84.6
Dish 76.4
Sitting 75.6
Furniture 71.5
Table 64.3
Text 63.7
Window 61.2
Restaurant 60.3
Apparel 57.9
Clothing 57.9
Finger 57.6
Cafeteria 56.7

Imagga
created on 2022-03-05

waiter 54.2
man 43.7
dining-room attendant 41.9
employee 35.2
male 34.8
restaurant 33.5
worker 32.2
adult 30.6
person 29.1
people 26.8
happy 25.7
indoors 25.5
office 24.6
working 23.9
senior 23.4
home 22.3
business 21.9
building 20.7
sitting 20.6
smiling 20.3
computer 20.1
work 18.8
businessman 18.6
job 17.7
lifestyle 17.4
kitchen 16.4
businesswoman 15.5
desk 15.3
meeting 15.1
professional 14.8
laptop 14.7
elderly 14.4
cafeteria 14.3
smile 14.3
seller 14.1
table 14
couple 13.9
structure 13.4
mature 13
cheerful 13
food 13
looking 12.8
40s 12.7
technology 12.6
day 12.6
team 12.6
adults 12.3
color 12.2
clothing 12
men 12
camera 12
casual 11.9
workplace 11.4
together 11.4
teamwork 11.1
happiness 11
holding 10.7
talking 10.5
businesspeople 10.4
two 10.2
portrait 9.7
mid adult 9.7
daytime 9.6
30s 9.6
cooking 9.6
education 9.5
glasses 9.3
occupation 9.2
indoor 9.1
attractive 9.1
meal 9.1
student 9.1
suit 9
group 8.9
success 8.9
casual clothing 8.8
chef 8.7
retirement 8.6
corporate 8.6
inside 8.3
handsome 8
glass 7.9
face 7.8
shop 7.8
retired 7.8
1 7.7
executive 7.7
class 7.7
pretty 7.7
monitor 7.7
old 7.7
room 7.6
counter 7.6
horizontal 7.5
one person 7.5
clothes 7.5
breakfast 7.4
coffee 7.4
classroom 7.4
20s 7.3
dinner 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 98.6
man 96.9
window 96.9
text 95.9
black and white 91.7
clothing 87.9
dish 78.2
sushi 61.7

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 67.6%
Calm 94.3%
Confused 1.6%
Sad 1.3%
Surprised 1.1%
Disgusted 0.8%
Angry 0.4%
Happy 0.4%
Fear 0.2%

AWS Rekognition

Age 41-49
Gender Male, 59.5%
Calm 96.1%
Sad 1.9%
Surprised 0.4%
Angry 0.4%
Confused 0.4%
Fear 0.3%
Disgusted 0.3%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man sitting in front of a window 90.5%
a man sitting at a table in front of a window 90.4%
a man is sitting in front of a window 84.9%

Text analysis

Amazon

PA
..
الله
CHIP

Google

8 O NI teRANDOON ONNE
NI
O
teRANDOON
8
ONNE