Human Generated Data

Title

Untitled (employees lined up to receive Christmas turkey)

Date

1948

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19372

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.5
Person 99.4
Person 98.7
Person 98.6
Person 96.4
Clothing 94.3
Apparel 94.3
Clinic 88
Person 78.7
Coat 74.1
Chair 68.3
Furniture 68.3
Nurse 66.8
Hospital 66.5
Person 65.9
Person 60
Shop 57.2
Lab 55.9
Lab Coat 55.1

Clarifai
created on 2023-10-22

people 99.7
group 98.3
adult 98.2
woman 96.8
man 95.9
group together 95.4
medical practitioner 92.9
monochrome 91.2
education 89.6
indoors 88.8
scientist 88.1
room 86.1
hospital 85.2
three 85.1
medicine 84.5
uniform 83.3
five 81.7
administration 81.5
several 81.4
outerwear 81.3

Imagga
created on 2022-03-05

barbershop 41.9
shop 41.7
counter 31
room 30.4
man 30.2
people 29.6
interior 29.2
business 26.7
mercantile establishment 26.6
office 26.1
table 25.3
restaurant 24.7
turnstile 24.7
modern 24.5
gate 23.2
person 22.6
male 22
adult 19.9
men 19.7
chair 19.5
businessman 19.4
work 18.8
indoors 18.4
women 18.2
place of business 17.8
classroom 17.6
happy 17.5
smiling 16.6
building 16.5
indoor 16.4
corporate 16.3
professional 16.2
group 16.1
movable barrier 15.6
sitting 15.5
cafeteria 15
furniture 14.7
meeting 14.1
couple 13.9
communication 13.4
lifestyle 13
executive 12.9
inside 12.9
computer 12.8
two 12.7
team 12.5
teacher 12.2
teamwork 12.1
light 12
businesswoman 11.8
job 11.5
hospital 11.4
together 11.4
standing 11.3
education 11.3
home 11.2
suit 10.8
smile 10.7
decor 10.6
working 10.6
desk 10.5
barrier 10.5
design 10.1
structure 9.8
conference 9.8
worker 9.4
casual 9.3
mature 9.3
laptop 9.1
establishment 8.9
seat 8.7
architecture 8.6
center 8.6
dining 8.6
talking 8.6
nurse 8.5
portrait 8.4
floor 8.4
window 8.4
service 8.3
occupation 8.2
board 8.1
looking 8
equipment 8
medical 7.9
employee 7.9
hall 7.9
space 7.8
3d 7.7
wall 7.7
waiter 7.7
comfortable 7.6
businesspeople 7.6
dinner 7.6
horizontal 7.5
life 7.5
house 7.5
manager 7.4
holding 7.4
technology 7.4
20s 7.3
food 7.2
success 7.2
decoration 7.2
day 7.1

Google
created on 2022-03-05

Suit 77.6
White-collar worker 70.9
Monochrome 70.5
Event 70.4
Monochrome photography 69.9
Room 66.4
Chair 62.1
History 59.2
Job 58.6
Service 56.9
Uniform 56.7
Employment 56.4
Collaboration 55.1
Machine 54.7
Desk 51.9
Font 50.5
Sitting 50.4

Microsoft
created on 2022-03-05

person 94.9
furniture 88.6
text 86.1
table 85.4
clothing 78.6
chair 60.6

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.8%
Calm 36.6%
Confused 30.3%
Sad 21.1%
Angry 5.5%
Disgusted 2.5%
Happy 2.1%
Surprised 1.2%
Fear 0.7%

AWS Rekognition

Age 52-60
Gender Male, 95.7%
Calm 92.3%
Sad 3%
Confused 2.2%
Happy 1.2%
Surprised 0.7%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Female, 88.1%
Calm 90.5%
Sad 7.4%
Happy 0.9%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 51-59
Gender Male, 93.9%
Sad 40.5%
Happy 26.5%
Calm 12.8%
Surprised 8.8%
Confused 7.2%
Disgusted 1.7%
Angry 1.2%
Fear 1.2%

AWS Rekognition

Age 52-60
Gender Female, 97.5%
Calm 94.2%
Happy 3.6%
Sad 1%
Fear 0.4%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 27-37
Gender Male, 84.1%
Calm 99.3%
Sad 0.3%
Happy 0.1%
Confused 0.1%
Fear 0.1%
Angry 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 52-60
Gender Male, 99.8%
Calm 98.5%
Happy 1%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Person 99.5%
Person 99.4%
Person 98.7%
Person 98.6%
Person 96.4%
Person 78.7%
Person 65.9%
Person 60%
Chair 68.3%

Text analysis

Amazon

MERRY
NEW
NEW YEAR
YEAR
PROSPEROUS
MERRY DUS
TO
DUS
143A
KODVK-SLA

Google

YT33A2- AO
YT33A2-
AO