Human Generated Data

Title

Untitled (Christmas picture, nurse and women in scrubs, in front of fireplace)

Date

c.1970, from earlier negative

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18593

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Chair 98.9
Furniture 98.9
Clothing 98.1
Apparel 98.1
Person 97.9
Person 94.5
Home Decor 91.4
Person 84.6
Couch 82
Flooring 78.3
Floor 76.4
Indoors 66
Living Room 64.5
Room 64.5
Hat 63.6
Building 62.2
Sitting 60.1
Art 59.7
Drawing 57.8
Housing 57.3
Shorts 56.4
Silhouette 55.9
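
The label and confidence pairs above are the kind of output returned by Amazon Rekognition's label detection. A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local filename:

```python
import boto3

# Hypothetical client setup; AWS credentials are assumed to be
# configured in the environment.
rekognition = boto3.client("rekognition")

with open("sullivan_4.2002.18593.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the list above bottoms out around 55%
    )

# Print label/confidence pairs in the same "Name Confidence" form as above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```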

Clarifai
created on 2023-10-22

people 99.9
adult 98.9
furniture 98.3
two 98.1
monochrome 97.2
woman 95.7
administration 95.7
wear 95.6
newspaper 95.6
man 95.5
group 93.6
leader 92.5
one 92
sit 92
three 90.7
group together 89.2
seat 88.6
child 88.5
home 87.4
chair 87.4

Imagga
created on 2022-03-05

crutch 34.6
staff 28
painter 25.9
man 24.2
people 21.2
stick 20.3
chair 19.7
person 16.8
room 15.7
passenger 14.6
male 14.2
business 14
city 13.3
old 13.2
building 12.9
work 11.8
adult 11.3
life 10.9
working 10.6
table 10.4
men 10.3
black 10.2
seat 10
travel 9.9
hospital 9.6
walking 9.5
lifestyle 9.4
outdoors 9.1
office 9
lady 8.9
interior 8.8
businessman 8.8
patient 8.8
urban 8.7
architecture 8.6
sitting 8.6
tree 8.5
portrait 8.4
window 8.4
silhouette 8.3
mask 8
day 7.8
modern 7.7
walk 7.6
leisure 7.5
street 7.4
light 7.4
back 7.3
suit 7.2
home 7.2
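
Tag/confidence lists like the Imagga block above can be retrieved from Imagga's REST tagging endpoint. A sketch assuming the v2 /tags endpoint, hypothetical API credentials, a hypothetical image URL, and the documented response shape for that endpoint:

```python
import requests

# Hypothetical key/secret pair and image URL; the endpoint uses HTTP basic auth.
API_KEY, API_SECRET = "<api-key>", "<api-secret>"
IMAGE_URL = "https://example.org/sullivan_4.2002.18593.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence score and a localized tag name.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```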

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.2
black and white 93.4
street 91.3
outdoor 90.4
clothing 86.7
person 84.8
monochrome 77.4
furniture 67.1
drawing 60.9
man 58.4
footwear 56.7
chair 51.2
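
The Microsoft tags above correspond to the Azure Computer Vision tagging operation. A sketch assuming the azure-cognitiveservices-vision-computervision SDK, with a hypothetical endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and subscription key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# tag_image takes a publicly reachable image URL (hypothetical here)
# and returns tag name / confidence pairs.
result = client.tag_image("https://example.org/sullivan_4.2002.18593.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```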

Color Analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 66.5%
Calm 99.1%
Sad 0.5%
Fear 0.1%
Happy 0.1%
Disgusted 0.1%
Confused 0%
Angry 0%
Surprised 0%
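
The age range, gender, and emotion percentages above come from Rekognition's face detection with all facial attributes requested. A minimal sketch, again with boto3 and a hypothetical filename:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("sullivan_4.2002.18593.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```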

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
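
The Very unlikely / Unlikely values are Google Cloud Vision's likelihood buckets for face attributes. A sketch assuming the google-cloud-vision client library and a hypothetical filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sullivan_4.2002.18593.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is reported as a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```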

Feature analysis

Amazon

Person 99.4%
Person 97.9%
Person 94.5%
Person 84.6%
Chair 98.9%

Categories

Text analysis

Google

17--YT33A°2--XAGO
17--YT33A°2--XAGO
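
Strings like those above are typically in-image lettering (for example mirrored film edge printing) picked up by Google Vision's OCR rather than caption text. A sketch of the text-detection call, assuming the same client library and a hypothetical filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sullivan_4.2002.18593.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full detected text; later ones are individual blocks.
for annotation in response.text_annotations:
    print(annotation.description)
```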