Human Generated Data

Title

Untitled (two men with dead bobcats in kitchen)

Date

1955

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18195

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.7
Human 99.7
Person 99.4
Clothing 93.7
Apparel 93.7
Person 92
Face 81.9
Shoe 81.4
Footwear 81.4
Photography 60.6
Photo 60.6
Shoe 60.1
Wood 59.2
Pants 58.3
Leisure Activities 57.9
Floor 56.1
Flooring 55.9

Clarifai
created on 2023-10-22

people 99.9
two 98.5
adult 97.2
group together 97.2
group 96.7
man 96.2
three 94.8
four 92
street 90.7
woman 90.4
music 89.8
wear 89.4
several 88.1
monochrome 87.3
musician 83.3
actor 83
administration 82.4
many 82.3
singer 81.4
elderly 80.3

Imagga
created on 2022-03-04

man 34.3
telephone 33.1
pay-phone 30.6
call 29.7
male 28.4
people 24
person 22.3
electronic equipment 22.1
business 21.9
equipment 21.5
adult 19
room 18.3
office 17.8
working 15
businessman 15
portrait 14.9
indoors 14
happy 13.8
fashion 13.6
one 13.4
work 13.3
black 13.3
men 12
machine 11.9
old 11.8
worker 11.7
device 11.7
lifestyle 11.6
smile 11.4
suit 11.3
corporate 11.2
sitting 11.2
hand 10.6
couple 10.4
attractive 9.8
human 9.7
smiling 9.4
two 9.3
dark 9.2
city 9.1
computer 9.1
holding 9.1
professional 9
shop 8.9
interior 8.8
home 8.8
cash machine 8.7
inside 8.3
back 8.3
businesswoman 8.2
style 8.2
urban 7.9
face 7.8
executive 7.5
street 7.4
job 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.2
person 99
clothing 96.1
man 90.2
standing 83.5
posing 71.7
footwear 59.8
black and white 54.8

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 98.3%
Calm 91.3%
Sad 2.9%
Confused 2.8%
Disgusted 0.8%
Surprised 0.7%
Happy 0.7%
Angry 0.6%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Female, 57.6%
Calm 76.2%
Sad 10.7%
Happy 10.7%
Confused 0.7%
Fear 0.5%
Surprised 0.4%
Disgusted 0.4%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.7%
Person 99.4%
Person 92%
Shoe 81.4%
Shoe 60.1%

Categories

Text analysis

Amazon

FOX
PRODIGAL
THE
Research
6-02
Research passport
passport

Google

MJI7--YT3RA°2-->AGO AINW EGE
MJI7--YT3RA°2-->AGO
AINW
EGE