Human Generated Data

Title

Untitled (interior view of employees sitting at their desks inside shoe company)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9429

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Restaurant 99.4
Chair 98.6
Furniture 98.6
Person 96.3
Person 95.7
Person 95.6
Cafeteria 93.7
Person 91.2
Person 90.8
Person 85.3
Person 81.8
Workshop 80.7
Chair 78.2
Indoors 77.3
Person 72.6
Building 70.4
Tabletop 67.8
Room 65.8
Table 61.1
Meal 58.4
Food 58.4
Cafe 57.7
Lab 56.9
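
Each Amazon label above is paired with a confidence score in percent. As a non-authoritative point of reference, the sketch below shows one way such labels could be requested from Amazon Rekognition with boto3; the local file name and the thresholds are illustrative assumptions and are not part of this record.

```python
# Minimal sketch: object/scene labels with confidence scores via Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

# Placeholder path to the digitized photograph (assumed name).
with open("schweig_shoe_company_interior.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap the number of returned labels
    MinConfidence=55.0,  # roughly the lowest score listed above
)

# Print "label confidence" pairs, mirroring the list format above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```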

Clarifai
created on 2023-10-27

room 99
people 98.8
furniture 98.7
indoors 98.1
table 97.6
desk 97.2
chair 96.5
employee 95.8
man 95.2
adult 95
restaurant 92.7
woman 92.5
seat 91.8
grinder 91.3
group 90.7
industry 90.1
inside 88.5
food 86.4
one 84.2
monochrome 84.1

Imagga
created on 2022-01-23

barbershop 82.7
shop 77.9
mercantile establishment 55.8
interior 49.5
chair 38.2
counter 37.8
place of business 37.3
room 36.5
furniture 29.9
modern 28.7
table 28.1
indoors 28.1
kitchen 28
home 24.7
office 24.5
house 21.7
salon 20.4
establishment 19.5
indoor 19.2
design 19.1
people 19
work 18.8
decor 18.6
business 17.6
inside 17.5
seat 17.2
man 16.1
wood 15.8
floor 15.8
glass 14.8
window 14.7
apartment 14.4
desk 14.1
stove 14.1
light 14
luxury 13.7
male 13.5
mirror 13.3
chairs 12.7
architecture 12.5
lamp 12.4
restaurant 12.2
food 12.1
residential 11.5
comfortable 11.5
computer 11.2
person 11.1
occupation 11
elegance 10.9
hairdresser 10.7
professional 10.7
check 10.6
working 10.6
lifestyle 10.1
adult 10
cabinet 9.9
medical 9.7
style 9.6
cooking 9.6
3d 9.3
clinic 9.2
equipment 9.2
clean 9.2
cook 9.1
domestic 9
center 8.9
job 8.8
wooden 8.8
life 8.7
urban 8.7
lighting 8.7
exam 8.6
empty 8.6
men 8.6
wall 8.6
smile 8.5
shoe shop 8.5
contemporary 8.5
portrait 8.4
health 8.3
bar 8.3
happy 8.1
building 8.1
steel 8
women 7.9
oven 7.9
barber chair 7.8
sink 7.8
nobody 7.8
hospital 7.7
dining 7.6
relax 7.6
dinner 7.6
fashion 7.5
group 7.3
success 7.2
decoration 7.2
classroom 7.2
medicine 7
patient 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

furniture 96.1
text 95.8
table 94.1
indoor 89.3
black and white 84.4
computer 75.7
desk 74.9
chair 74.3
white 66.2
cluttered 26.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 98%
Sad 57.3%
Disgusted 12.2%
Calm 11.6%
Confused 11.6%
Surprised 2.8%
Angry 2.4%
Happy 1.3%
Fear 0.7%

AWS Rekognition

Age 26-36
Gender Male, 99.4%
Happy 34.4%
Calm 25.7%
Sad 18.9%
Fear 8.6%
Angry 5%
Surprised 2.9%
Confused 2.7%
Disgusted 1.7%

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Calm 89.9%
Sad 4.6%
Fear 3.1%
Confused 0.7%
Angry 0.5%
Happy 0.5%
Surprised 0.4%
Disgusted 0.2%

AWS Rekognition

Age 23-33
Gender Female, 80.5%
Calm 95.4%
Sad 2.2%
Happy 1.2%
Confused 0.4%
Disgusted 0.2%
Fear 0.2%
Surprised 0.2%
Angry 0.2%

AWS Rekognition

Age 34-42
Gender Male, 95.9%
Calm 74.7%
Happy 10%
Angry 4.6%
Sad 3.8%
Disgusted 2.2%
Surprised 2.2%
Confused 1.9%
Fear 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
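
The per-face age ranges, gender estimates, and ranked emotions above are the kind of output returned by Amazon Rekognition's DetectFaces API. A minimal sketch follows, assuming a local copy of the image; the file name is an illustrative assumption.

```python
# Minimal sketch: face attributes (age range, gender, emotions) via Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("schweig_shoe_company_interior.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come with confidences; sort highest first, as in the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```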

Feature analysis

Amazon

Person
Chair
Person 99.6%
Person 96.3%
Person 95.7%
Person 95.6%
Person 91.2%
Person 90.8%
Person 85.3%
Person 81.8%
Person 72.6%
Chair 98.6%
Chair 78.2%

Captions

Text analysis

Amazon

AD
weens
YТ3AX
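
The short text fragments above are machine-read text detected in the photograph. A minimal sketch of how such fragments could be extracted with Amazon Rekognition's DetectText API is shown below; the file name is an illustrative assumption.

```python
# Minimal sketch: detected text lines via Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("schweig_shoe_company_interior.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep only line-level detections to mirror the short list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```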