Human Generated Data

Title

Untitled (men working on printing presses)

Date

1949

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20240

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 94.9
Building 92.7
Factory 87.3
Machine 73.7
Sitting 63.1
Clothing 61.8
Apparel 61.8
Interior Design 58.2
Indoors 58.2
LCD Screen 55.8
Electronics 55.8
Screen 55.8
Monitor 55.8
Display 55.8

Clarifai
created on 2023-10-22

people 99.3
adult 99.1
machine 98.5
two 97.9
one 97
man 96.7
industry 95.9
machinery 95.5
room 93.2
equipment 91
technology 90.6
grinder 89.8
work 88.7
woman 88.5
wear 87.3
actor 86.3
three 86.2
elderly 85.8
production 84.9
group 84.9

Imagga
created on 2022-03-05

locker 44.1
device 42.3
fastener 34.7
machine 34.2
people 26.2
restraint 26
man 24.8
cash machine 22.5
work 22
equipment 21.4
male 21.3
indoors 21.1
adult 20.1
person 19.9
home 18.3
working 17.7
worker 17
happy 16.3
room 15.3
interior 15
job 14.1
business 14
smiling 13.7
lifestyle 13.7
computer 13.1
industry 12.8
smile 12.1
office 11.2
occupation 11
portrait 10.3
men 10.3
hospital 10.2
20s 10.1
attractive 9.8
cheerful 9.7
home appliance 9.6
skill 9.6
kitchen 9.3
clothing 9.2
face 9.2
inside 9.2
house 9.2
industrial 9.1
technology 8.9
happiness 8.6
professional 8.6
clothes 8.4
appliance 8.3
health 8.3
human 8.2
indoor 8.2
one 8.2
looking 8
to 8
camera 7.9
education 7.8
factory 7.7
pretty 7.7
casual 7.6
apparatus 7.5
building 7.4
toaster 7.4
lady 7.3
kitchen appliance 7.2
women 7.1
patient 7.1
businessman 7.1
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.2
appliance 91.7
black and white 90.5
person 87.2
window 81.9
clothing 73.3
monochrome 59.6
white goods 59.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 83.7%
Calm 85.6%
Fear 9.7%
Happy 1.7%
Surprised 1.2%
Angry 0.6%
Sad 0.5%
Disgusted 0.4%
Confused 0.3%

Feature analysis

Amazon

Person
Person 99.6%
Person 94.9%

Categories

Imagga

paintings art 97.5%
people portraits 1.5%

Text analysis

Amazon

MUS
MUS VITTAL
VITTAL

Google

VI10 0324
VI10
0324