Human Generated Data

Title

Untitled (men with bobcats)

Date

1955

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18196

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Person 99.3
Clothing 96.1
Apparel 96.1
Face 70.7
Chair 66.8
Furniture 66.8
Shorts 66.4
Pants 60.2
Long Sleeve 57.9
Sleeve 57.9
Door 57.4
Wood 56.1
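
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of such a call follows, assuming a local copy of the image and illustrative thresholds; the file name, region, and limits are assumptions, not the museum's actual pipeline.

```python
# Hypothetical sketch: producing label/confidence pairs like the list above
# with AWS Rekognition's DetectLabels API via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_men_with_bobcats.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=55.0,  # the list above bottoms out in the mid-50s
    )

for label in response["Labels"]:
    # Prints e.g. "Person 99.6", matching the tag/score format shown above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```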

Clarifai
created on 2023-10-22

people 99.8
two 99.1
man 95.9
adult 95.6
group together 94.9
three 94.6
group 94
four 91
wear 89.1
woman 86
monochrome 84.5
administration 80.8
several 80.7
elderly 80.3
actor 79.6
street 79.2
music 78.8
room 75.5
musician 74.9
retro 74.7

Imagga
created on 2022-03-04

man 30.9
person 29.6
computer 29.3
people 27.9
male 25.5
adult 24.3
office 23.6
disk jockey 22.3
laptop 21.3
business 21.2
work 20.4
room 19.2
keyboard 18
broadcaster 17.9
indoors 16.7
men 16.3
working 15.9
worker 15.5
happy 14.4
communicator 14.1
corporate 13.7
hand 13.7
businessman 13.2
lifestyle 13
device 12.8
women 12.6
black 12.6
attractive 12.6
sitting 12
businesswoman 11.8
portrait 11.6
job 11.5
table 11.4
couple 11.3
equipment 11.2
senior 11.2
modern 11.2
home 11.2
old 11.1
face 10.6
professional 10.5
human 10.5
one 10.4
looking 10.4
technology 10.4
shop 10.1
monitor 10
holding 9.9
interior 9.7
musical instrument 9.6
desk 9.5
love 9.5
smiling 9.4
light 9.4
mature 9.3
indoor 9.1
teacher 8.8
busy 8.7
suit 8.5
finance 8.4
screen 8.4
notebook 8.4
machine 8.3
fashion 8.3
alone 8.2
percussion instrument 7.7
pretty 7.7
casual 7.6
two 7.6
career 7.6
house 7.5
occupation 7.3
lady 7.3
executive 7.3
dress 7.2
smile 7.1
clothing 7
together 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 97.5
person 97.4
clothing 95.4
man 82.6
standing 78.5
footwear 78.2
black and white 74.8

Color analysis

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 81.9%
Calm 50.8%
Happy 23%
Sad 19.5%
Confused 2.3%
Surprised 2%
Disgusted 1%
Angry 0.7%
Fear 0.7%

AWS Rekognition

Age 40-48
Gender Male, 88.3%
Surprised 36.7%
Calm 32.6%
Sad 10.9%
Confused 6.7%
Disgusted 5.4%
Angry 4%
Happy 3%
Fear 0.6%

AWS Rekognition

Age 22-30
Gender Female, 86.4%
Calm 34.4%
Happy 30%
Sad 20.4%
Fear 9.5%
Disgusted 2.1%
Confused 1.8%
Angry 1%
Surprised 0.8%
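
The age, gender, and emotion estimates above are the kind of per-face output AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch under that assumption follows; the file name and region are illustrative, not the museum's actual configuration.

```python
# Hypothetical sketch: reproducing the age/gender/emotion format above with
# AWS Rekognition's DetectFaces API via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_men_with_bobcats.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```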

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
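
Unlike Rekognition, Google Cloud Vision reports the surprise, anger, sorrow, joy, headwear, and blur attributes above as likelihood buckets (from "Very unlikely" to "Very likely") rather than percentages. A minimal sketch of such a call, assuming a local copy of the image:

```python
# Hypothetical sketch: reading face-attribute likelihoods like the ones above
# from the Google Cloud Vision face detection API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_men_with_bobcats.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```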

Feature analysis

Amazon

Person
Person 99.6%
Person 99.3%

Text analysis

Amazon

PRODIGAL
FOX
642
MJI7--YT37AS--AO
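
The detected strings above are the kind of output AWS Rekognition's DetectText operation returns when run over the photograph (signage, stamps, and handwriting often come back partially garbled). A minimal sketch under that assumption; the file name and region are illustrative.

```python
# Hypothetical sketch: extracting detected text lines like the ones above
# with AWS Rekognition's DetectText API via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_men_with_bobcats.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries correspond to the detected strings listed above;
    # WORD entries break those lines into individual tokens.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```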

Google

MATE MJI7--YT3RA°2--XAGOX
MATE
MJI7--YT3RA°2--XAGOX