Human Generated Data

Title

Untitled (woman holding wrapped boxes)

Date

1949

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19347

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Person 99
Human 99
Musician 97.5
Musical Instrument 97.5
Leisure Activities 85.1
Guitarist 83.5
Performer 83.5
Guitar 83.5
Curtain 66.5
Clothing 63.6
Apparel 63.6
Female 62.6
Overcoat 58.8
Coat 58.8
Wood 56.5

Clarifai
created on 2023-10-22

people 99.7
curtain 98.9
wear 98.5
adult 97.4
theater 97.4
one 97.2
opera 96.7
stage 96.7
music 95.9
man 95
woman 94.3
outfit 94.2
musician 93.6
portrait 92.9
costume 91.7
administration 90.9
comedy 90.8
dress 88.6
singer 88.4
jazz 87.1

Imagga
created on 2022-03-05

musical instrument 96.4
wind instrument 65.4
concertina 64.2
free-reed instrument 52.3
accordion 43
keyboard instrument 33.7
steel drum 33.4
percussion instrument 30.1
man 28.9
people 24
person 23
male 22.7
portrait 22
men 20.6
adult 19
black 16.4
device 16.1
business 14
women 12.6
holding 12.4
lifestyle 12.3
fashion 12
attractive 11.2
pretty 10.5
one 10.4
building 10.3
suit 9.9
modern 9.8
cheerful 9.7
human 9.7
professional 9.7
office 9.6
couple 9.6
sitting 9.4
work 9.4
model 9.3
casual 9.3
city 9.1
alone 9.1
style 8.9
sexy 8.8
urban 8.7
statue 8.7
two 8.5
outdoor 8.4
vintage 8.3
outdoors 8.2
worker 8.1
job 8
smiling 7.9
love 7.9
face 7.8
art 7.8
travel 7.7
old 7.7
happy 7.5
silhouette 7.4
businesswoman 7.3
stylish 7.2
looking 7.2
smile 7.1
interior 7.1
working 7.1
businessman 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

curtain 98.6
person 91.6
black 73.1
clothing 71.6
black and white 69.5
human face 53.6

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 99.7%
Calm 66.1%
Happy 12%
Surprised 11.4%
Fear 3.8%
Confused 2.3%
Sad 2.1%
Disgusted 1.6%
Angry 0.8%

Feature analysis

Amazon

Person
Person 99%

Text analysis

Amazon

S8
YT3RA2-XAGOX

Google

S8 YT3RA2-XA
S8
YT3RA2-XA