Human Generated Data

Title

Untitled (man sitting behind desk with papers)

Date

c. 1953

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20130

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 98.3
Apparel 98.3
Human 96.1
Person 94.4
Coat 85.9
Flooring 82.1
Wood 69.9
Portrait 67.5
Photography 67.5
Face 67.5
Photo 67.5
Finger 58.6
Sleeve 56.9
Icing 56.5
Food 56.5
Dessert 56.5
Cake 56.5
Cream 56.5
Creme 56.5
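
The label-and-confidence pairs above have the shape of output from AWS Rekognition's DetectLabels operation. The following is a minimal sketch, assuming the boto3 SDK; the bucket name, object key, and MinConfidence threshold are placeholders, not values from this record.

import boto3

# Hypothetical client and image location; the actual source image is not specified here.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-man-behind-desk.jpg"}},
    MaxLabels=25,
    MinConfidence=55,  # assumption: lowest score shown above is roughly 56.5
)

# Print "Label Confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')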

Clarifai
created on 2023-10-22

people 99.8
adult 98.8
portrait 97.4
monochrome 97
one 96.8
man 96.4
indoors 94
woman 94
medicine 91.8
medical practitioner 89.2
chair 89.2
wear 89.1
room 88.1
furniture 87.1
concentration 86.5
desk 83.9
administration 83
sit 82.8
hospital 82.1
telephone 82

Imagga
created on 2022-03-05

bride 29.7
people 28.4
wedding 26.7
man 25.6
person 23.4
dress 20.8
couple 20
groom 20
sword 19.8
male 19.1
love 18.9
negative 17.4
adult 16.9
happiness 16.4
portrait 16.2
weapon 15.7
professional 15.3
film 14.8
smile 14.2
gown 13.8
men 13.7
picket fence 13.7
married 13.4
happy 11.9
pretty 11.9
women 11.9
bouquet 11.6
businessman 11.5
flowers 11.3
life 11.3
business 10.9
worker 10.9
coat 10.8
suit 10.8
fence 10.7
photographic paper 10.7
celebration 10.4
lifestyle 10.1
fashion 9.8
veil 9.8
lady 9.7
marriage 9.5
smiling 9.4
event 9.2
outdoor 9.2
attractive 9.1
black 9.1
work 8.6
face 8.5
summer 8.4
barrier 8.3
instrument 8.3
holding 8.3
human 8.2
job 8
urban 7.9
wed 7.9
youth 7.7
wife 7.6
lab coat 7.5
house 7.5
city 7.5
park 7.4
office 7.2
home 7.2
photographic equipment 7.1
family 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

indoor 92.6
black and white 91.2
person 90.3
text 89.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 43-51
Gender Male, 100%
Confused 47%
Surprised 17.7%
Sad 16%
Calm 6.6%
Happy 4.7%
Disgusted 3.2%
Angry 2.5%
Fear 2.3%
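
The age range, gender, and emotion percentages above match the structure of an AWS Rekognition DetectFaces response. A minimal sketch, assuming boto3 and a placeholder image location:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-man-behind-desk.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

# Report each detected face in the same format as the listing above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')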

Feature analysis

Amazon

Person
Person 94.4%

Categories

Captions

Microsoft
created on 2022-03-05

a man sitting on a table 73.3%
a man sitting in a room 73.2%
a man sitting at a table 73.1%

Text analysis

Amazon

8.0
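
The detected string above is consistent with AWS Rekognition's DetectText output. A minimal sketch, again assuming boto3 and a placeholder image reference:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-man-behind-desk.jpg"}}
)

# Print line-level detections only, matching the single line reported above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])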

Google

YT37A2- AO
YT37A2-
AO