Human Generated Data

Title

Untitled (men and women in office)

Date

1939

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21910

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.5
Human 99.5
Person 99.2
Person 99.1
Person 98.3
Workshop 90.5
Building 85.6
Person 83.8
Factory 76.3
Furniture 76.2
Indoors 73.3
Machine 70.2
Room 68.4
Monitor 68
Electronics 68
Display 68
Screen 68
Clinic 66
Table 61.9
Housing 61.3
Person 61
Lab 59.2
People 55.9

Clarifai
created on 2023-10-22

people 99.9
group 99.4
many 99.2
adult 98.9
group together 98.8
monochrome 98.1
man 97.3
furniture 96.1
woman 96.1
administration 95.9
several 95.9
war 92.9
room 92.1
two 91.7
wear 91.6
three 91.6
one 90.1
vehicle 89.1
four 88
five 87.8

Imagga
created on 2022-03-11

barbershop 100
shop 100
mercantile establishment 94.2
place of business 62.9
establishment 31.4
interior 24.8
people 23.4
man 22.2
room 21.1
chair 17.2
home 16.7
house 16.7
indoors 15.8
work 15.7
counter 15
building 14.6
lifestyle 14.4
modern 14
table 13.8
inside 13.8
adult 13.6
male 13.5
business 13.4
person 13.2
men 12.9
industry 12.8
indoor 12.8
kitchen 12.5
architecture 12.5
city 12.5
light 12
furniture 12
office 11.3
old 11.1
restaurant 11.1
working 9.7
food 9.7
sitting 9.4
wood 9.2
technology 8.9
job 8.8
computer 8.8
machine 8.7
professional 8.5
window 8.2
worker 8.2
industrial 8.2
happy 8.1
equipment 8.1
center 8
decoration 8
smiling 8
women 7.9
love 7.9
design 7.9
glass 7.8
construction 7.7
senior 7.5
style 7.4
aged 7.2
color 7.2
cut 7.2
smile 7.1
decor 7.1
steel 7.1
travel 7

Google
created on 2022-03-11

Window 90.1
Black-and-white 83.4
Art 81.8
Table 78
Hat 77.1
Machine 74.2
Monochrome photography 71.3
Monochrome 71
Vintage clothing 70.5
Room 68.1
Factory 66.6
Service 64.5
Job 62.8
Employment 61.9
Toolroom 61.6
History 61.3
Building 57.8
Office equipment 55.6
Desk 54.5
Illustration 54.1

Microsoft
created on 2022-03-11

kitchen 93.6
text 88.9
person 75.1
black and white 60.3
clothing 58
preparing 56.1
house 52.5
cluttered 19.6

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.5%
Sad 84.4%
Calm 8.6%
Confused 4.9%
Surprised 0.6%
Disgusted 0.5%
Angry 0.4%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 37-45
Gender Male, 99.5%
Calm 83%
Sad 9.3%
Confused 2.6%
Disgusted 2.2%
Happy 1%
Angry 0.8%
Surprised 0.6%
Fear 0.4%

AWS Rekognition

Age 33-41
Gender Male, 98.8%
Sad 96.4%
Confused 1.4%
Calm 1%
Happy 0.5%
Disgusted 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 96.6%
Calm 93.2%
Happy 1.6%
Angry 1.4%
Surprised 1.3%
Sad 1%
Confused 1%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 43-51
Gender Male, 97.9%
Calm 79.2%
Sad 7.5%
Confused 6.7%
Angry 3.4%
Disgusted 1.9%
Happy 0.5%
Surprised 0.5%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.5%
Person 99.2%
Person 99.1%
Person 98.3%
Person 83.8%
Person 61%

Text analysis

Amazon

7
MJ17YY33A2
MJ17YY33A2 ARDA
ARDA

Google

VEEV 2VEEIA EITH
VEEV
2VEEIA
EITH