Human Generated Data

Title

Untitled (man looking through files)

Date

1957

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20099

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Clothing 71.8
Apparel 71.8
Furniture 68
Photography 57.1
Photo 57.1
Door 56.8
Suit 56
Coat 56
Overcoat 56
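
Labels of this shape are what Rekognition's DetectLabels API returns. A minimal sketch of the call, assuming boto3 is installed and AWS credentials are configured; the filename is hypothetical:

```python
# Minimal sketch of AWS Rekognition label detection; the filename is
# hypothetical and AWS credentials are assumed to be configured.
import boto3

client = boto3.client("rekognition")

with open("untitled_man_looking_through_files.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # every tag above scores 56 or higher
    )

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```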

Clarifai
created on 2023-10-22

people 99.4
adult 98.8
monochrome 98.7
one 98.4
man 97.4
indoors 96.7
two 94.3
furniture 93.8
room 90.5
woman 90.4
wear 90
administration 88.6
technology 83.4
scientist 82.6
medical practitioner 82.3
three 81.4
sit 79.9
veil 79.3
science 78.8
desk 77.5
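
Clarifai concepts like these come from the platform's v2 predict endpoint. A sketch over plain HTTP, assuming a personal access token and Clarifai's public general-image-recognition model; the token and image URL are placeholders, and the exact endpoint path may differ by API version:

```python
# Sketch of a Clarifai v2 predict request; the access token and image URL
# are placeholders. Concepts come back with 0-1 values, printed here as
# percentages to match the list above.
import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
url = ("https://api.clarifai.com/v2/users/clarifai/apps/main"
       "/models/general-image-recognition/outputs")

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {PAT}"})
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```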

Imagga
created on 2022-03-05

electronic instrument 55.6
musical instrument 48.4
device 34
man 27.5
chair 27.5
person 27.4
male 26.2
people 23.4
business 23.1
office 21.8
room 21.6
businessman 21.2
computer 19.7
barber chair 19.4
adult 18.8
indoors 17.6
smiling 15.9
laptop 15.7
seat 15.4
furniture 15.2
home 15.1
interior 15
working 14.1
table 14.1
corporate 13.7
men 13.7
lifestyle 13.7
work 13.5
holding 13.2
briefcase 13.1
modern 12.6
job 12.4
smile 12.1
suit 12.1
happy 11.3
senior 11.2
professional 11.2
indoor 10.9
house 10.9
looking 10.4
desk 9.8
one 9.7
standing 9.6
barbershop 9.3
casual 9.3
shop 9.3
monitor 9.2
worker 9.1
black 9
technology 8.9
light 8.7
executive 8.6
sitting 8.6
screen 8.4
portrait 8.4
pretty 8.4
manager 8.4
mature 8.4
old 8.4
fashion 8.3
window 8.2
alone 8.2
confident 8.2
cheerful 8.1
attractive 7.7
elderly 7.7
sit 7.6
phone 7.4
back 7.3
businesswoman 7.3
handsome 7.1
information 7.1
architecture 7
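
Imagga tags of this shape come from its /v2/tags endpoint, authenticated with HTTP basic auth. A sketch; the key, secret, and image URL are placeholders:

```python
# Sketch of an Imagga /v2/tags request; the credentials and image URL are
# placeholders. Each tag arrives with a 0-100 confidence like the list above.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder basic-auth pair
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```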

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.2
piano 91.7
person 90.1
black and white 85.2
clothing 67.1
musical instrument 60.7
laptop 53.7
computer 34.1
desk 5.8
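
The Microsoft tags correspond to Azure's Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
# Sketch of Azure Computer Vision image tagging; endpoint, key, and image
# URL are placeholders. Confidences are 0-1, printed here as percentages.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),              # placeholder
)

result = client.tag_image("https://example.org/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```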

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 96.9%
Calm 94.3%
Angry 1.7%
Confused 1.5%
Sad 1.4%
Happy 0.4%
Disgusted 0.4%
Surprised 0.2%
Fear 0.1%
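
The age range, gender, and emotion percentages above are the standard output of Rekognition's DetectFaces API with all attributes requested. A sketch, with a hypothetical filename:

```python
# Sketch of AWS Rekognition face analysis; the filename is hypothetical.
# Attributes=["ALL"] requests the age range, gender, and emotion scores.
import boto3

client = boto3.client("rekognition")

with open("untitled_man_looking_through_files.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```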

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
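
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this section reads differently from Rekognition's. A sketch with the google-cloud-vision client; the filename is hypothetical:

```python
# Sketch of Google Cloud Vision face detection; the filename is hypothetical.
# Likelihood fields are enums whose names match the buckets shown above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_man_looking_through_files.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```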

Feature analysis

Amazon

Person
Person 99.5%

Text analysis

Amazon

FILM
KODAK
SAFETY
8
MANHATTAN

Google

KODAK SAFETY FILM 8
KODAK
SAFETY
FILM
8
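
The OCR lines above ("KODAK SAFETY FILM 8", the negative's edge printing, plus "MANHATTAN") come from the services' text-detection endpoints. A sketch of the Rekognition side; the filename is hypothetical, and Google's equivalent is text_detection in google-cloud-vision:

```python
# Sketch of AWS Rekognition text detection; the filename is hypothetical.
# WORD detections correspond to the individual tokens listed above.
import boto3

client = boto3.client("rekognition")

with open("untitled_man_looking_through_files.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```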