Human Generated Data

Title

Untitled (women working at tables at H.E. Harris Stamp Co.)

Date

July 7, 1954

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18105

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Assembly Line 100
Factory 100
Building 100
Person 99
Human 99
Person 98.6
Person 98.1
Person 96.3
Person 94.9
Person 93.8
Person 92
Person 85.8
Manufacturing 75.4
Person 62.5
Cafeteria 58.6
Restaurant 58.6
Person 53.7
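
The repeated "Person" rows above, each with its own score, are characteristic of object-detection output in which every detected instance carries a separate confidence. A minimal sketch of how such a flat tag list can be produced from a response shaped like AWS Rekognition's DetectLabels result (the sample response below is hypothetical and illustrative, not the actual API output for this photograph):

```python
# Hypothetical sample mimicking the shape of an AWS Rekognition
# DetectLabels response; values are illustrative, not this image's data.
sample_response = {
    "Labels": [
        {"Name": "Factory", "Confidence": 100.0, "Instances": []},
        {"Name": "Person", "Confidence": 99.0, "Instances": [
            {"Confidence": 99.0}, {"Confidence": 98.6}, {"Confidence": 62.5},
        ]},
    ]
}

def flatten_labels(response):
    """Flatten labels into (name, confidence) rows, one row per detected
    instance, falling back to the label-level confidence when a label
    has no per-instance detections."""
    rows = []
    for label in response["Labels"]:
        instances = label.get("Instances") or [{"Confidence": label["Confidence"]}]
        for inst in instances:
            rows.append((label["Name"], round(inst["Confidence"], 1)))
    return rows

for name, conf in flatten_labels(sample_response):
    print(f"{name} {conf:g}")
```

This yields one line per instance ("Factory 100", then three "Person" rows), matching the list format above.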

Clarifai
created on 2023-10-22

people 99.6
furniture 98.9
many 98.6
group together 98.4
group 97.3
room 95.5
monochrome 94.7
adult 93
man 91.7
woman 91.2
department store 89.7
desk 88.9
vehicle 88.3
employee 87.8
several 86.4
transportation system 85.7
commerce 84.6
war 83
indoors 82.6
airport 81.8

Imagga
created on 2022-03-04

shop 65.4
interior 50.4
mercantile establishment 43.7
room 39.1
chair 32.8
shoe shop 31.2
table 30.9
furniture 29.7
place of business 29.6
restaurant 29.4
modern 28.7
indoors 26.4
office 21.7
building 20.4
bakery 20.2
house 20.1
case 20
inside 19.3
indoor 19.2
business 18.2
design 18
decor 17.7
floor 17.7
home 17.5
light 16.7
structure 16.6
work 16.5
luxury 16.3
people 16.2
man 15.5
lifestyle 15.2
establishment 14.6
glass 14
3d 13.9
salon 13.4
wood 13.3
architecture 13.3
window 13.1
decoration 13
barbershop 13
counter 13
men 12.9
women 12.6
sofa 12.4
equipment 12.4
cafeteria 12
food 11.5
apartment 11.5
steel 11.5
lamp 11.4
dining 11.4
industry 11.1
kitchen 11
person 10.6
comfortable 10.5
urban 10.5
group 10.5
living 10.4
computer 10.4
style 10.4
empty 10.3
wall 10.3
bar 10.2
elegance 10.1
male 9.9
hotel 9.5
store 9.4
buy 9.4
nobody 9.3
casual 9.3
space 9.3
adult 9.1
stylish 9
plant 9
tables 8.9
working 8.8
businessman 8.8
residential 8.6
life 8.6
seat 8.5
meeting 8.5
contemporary 8.5
dinner 8.4
fashion 8.3
center 8.3
happy 8.1
classroom 8
smiling 8
education 7.8
sitting 7.7
retail 7.6
drink 7.5
library 7.4
teamwork 7.4
service 7.4
shopping 7.3
occupation 7.3
clinic 7.3
businesswoman 7.3
wooden 7

Google
created on 2022-03-04

Furniture 94.3
Table 92.1
Window 91.2
Building 85.2
Black-and-white 85
Art 78.9
Monochrome 77
Monochrome photography 76.8
Machine 73.7
Factory 69.9
Glass 67.3
Engineering 66.7
Room 65.8
Desk 64.3
Toolroom 60.9
Metal 60.6
Science 60.2
Wood 57
Weaving 52.6
Steel 51.7

Microsoft
created on 2022-03-04

indoor 96.9
window 94.6
room 81
black and white 71.5
person 61.3
text 58.4
building 55.8
table 36.5
furniture 26.5
dining table 6.2

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 61.6%
Calm 99.8%
Happy 0.1%
Surprised 0%
Sad 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 54-64
Gender Female, 81.6%
Calm 99.4%
Surprised 0.4%
Sad 0.1%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 39-47
Gender Female, 53.7%
Calm 62.6%
Sad 28.8%
Confused 3.7%
Happy 1.5%
Disgusted 1.1%
Angry 1%
Surprised 0.8%
Fear 0.5%
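
Each face block above lists emotions in descending order of confidence, as returned in the Emotions field of an AWS Rekognition DetectFaces FaceDetail. A small sketch of that ordering (the sample values echo the third face above and are illustrative only):

```python
# Hypothetical sample mirroring the Emotions field of an AWS Rekognition
# DetectFaces FaceDetail; values echo the third face above, for illustration.
emotions = [
    {"Type": "SAD", "Confidence": 28.8},
    {"Type": "CALM", "Confidence": 62.6},
    {"Type": "HAPPY", "Confidence": 1.5},
    {"Type": "CONFUSED", "Confidence": 3.7},
]

# Sort descending by confidence so the dominant emotion comes first,
# matching the row ordering in the face blocks above.
ranked = sorted(emotions, key=lambda e: e["Confidence"], reverse=True)
dominant = ranked[0]["Type"]
for e in ranked:
    print(f"{e['Type'].capitalize()} {e['Confidence']:g}%")
```

Here the dominant emotion is Calm at 62.6%, followed by Sad, Confused, and Happy.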

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
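
The "Very unlikely" / "Very likely" values above are Google Cloud Vision's Likelihood enum, which face annotations use for emotions (joy, sorrow, anger, surprise) and attributes (headwear, blur) instead of numeric scores. A sketch of mapping the enum names to the display strings shown above (the sample annotation is hypothetical, shaped like a FaceAnnotation):

```python
# Google Cloud Vision reports face attributes as Likelihood enum values;
# this maps the enum names to the display strings used above.
LIKELIHOOD_DISPLAY = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

# Hypothetical sample shaped like a FaceAnnotation's likelihood fields;
# values mirror the second (blurred) face above, for illustration.
sample_face = {
    "surprise_likelihood": "VERY_UNLIKELY",
    "anger_likelihood": "VERY_UNLIKELY",
    "sorrow_likelihood": "VERY_UNLIKELY",
    "joy_likelihood": "VERY_UNLIKELY",
    "headwear_likelihood": "VERY_UNLIKELY",
    "blurred_likelihood": "VERY_LIKELY",
}

for field, value in sample_face.items():
    attribute = field.removesuffix("_likelihood").capitalize()
    print(attribute, LIKELIHOOD_DISPLAY[value])
```

With these sample values the output reproduces the second face block above, where only "Blurred" is rated "Very likely".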

Feature analysis

Amazon

Person
Person 99%
Person 98.6%
Person 98.1%
Person 96.3%
Person 94.9%
Person 93.8%
Person 92%
Person 85.8%
Person 62.5%
Person 53.7%

Categories

Imagga

interior objects 99%

Text analysis

Amazon

Je