Human Generated Data

Title

Untitled (copy negative of portrait of western musical group posing with instruments)

Date

1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4209

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 98.9
Person 97.9
Leisure Activities 97.8
Musical Instrument 97.8
Guitar 97.8
Person 97.4
Person 97.4
Person 96
Art 91.5
Drawing 91.5
Musician 90
Sketch 79.4
Person 78.3
Portrait 62.7
Photography 62.7
Face 62.7
Photo 62.7
Jaw 59.9
Music Band 59.5
Performer 59.1
Banjo 56.7
Person 56.4
Guitar 52.2
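The machine-generated tags above are flat "label score" lines, where the trailing number is a confidence value (0–100). A minimal sketch of how such lines might be split into structured (label, confidence) pairs — the function name and approach are illustrative, not from any vendor API:

```python
def parse_tags(lines):
    """Split each 'Label score' line into a (label, confidence) pair.

    Labels may contain spaces (e.g. 'Musical Instrument 97.8'), so the
    score is taken as everything after the last space.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags


# Sample lines taken from the Amazon tag list above
sample = [
    "Person 99.5",
    "Musical Instrument 97.8",
    "Guitar 97.8",
]
print(parse_tags(sample))
# → [('Person', 99.5), ('Musical Instrument', 97.8), ('Guitar', 97.8)]
```

Splitting on the last space (rather than the first) is what keeps multi-word labels such as "Leisure Activities" or "Music Band" intact.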

Clarifai
created on 2019-06-01

people 99.8
adult 98.5
group 97
man 96
woman 95.4
illustration 93
monochrome 91.5
music 89.2
wear 86.5
veil 84.2
musician 81.8
chair 81.8
many 80.1
furniture 79.6
art 79.1
leader 79
administration 78.1
vintage 77.8
indoors 76.8
portrait 76.6

Imagga
created on 2019-06-01

sketch 100
drawing 89.2
representation 62.3
currency 19.7
money 19.6
cash 18.3
art 17.2
business 17
design 16.9
banking 16.5
paper 16.5
old 16
dollar 15.8
bank 14.9
vintage 14.1
grunge 13.6
finance 12.7
dollars 12.5
financial 12.5
bill 12.4
pattern 12.3
retro 12.3
symbol 12.1
us 11.6
economy 11.1
black 10.8
wealth 10.8
hundred 10.6
decoration 10.2
clip art 10.2
banknotes 9.8
exchange 9.5
architecture 9.4
savings 9.3
rich 9.3
investment 9.2
history 8.9
element 8.3
style 8.2
antique 7.9
people 7.8
negative 7.7
ink 7.7
payment 7.7
profit 7.6
house 7.5
silhouette 7.4
paint 7.2
market 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 99.8
window 90.1
person 82.1
musical instrument 71.9
sketch 69.3
clothing 69
drawing 66.6
posing 61.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Calm 45.7%
Disgusted 45.3%
Confused 45.4%
Surprised 45.6%
Happy 49.1%
Sad 48%
Angry 46%

AWS Rekognition

Age 30-47
Gender Male, 54.1%
Angry 45.6%
Happy 45.3%
Sad 45.3%
Disgusted 45.1%
Confused 45.3%
Calm 53.2%
Surprised 45.3%

AWS Rekognition

Age 15-25
Gender Female, 50.7%
Confused 45.8%
Angry 45.2%
Sad 46.2%
Calm 52.1%
Disgusted 45.1%
Happy 45.2%
Surprised 45.5%

AWS Rekognition

Age 26-43
Gender Male, 52.2%
Disgusted 45.2%
Calm 45.2%
Surprised 45.2%
Angry 45.2%
Happy 52%
Sad 47%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.5%
Calm 49.4%
Surprised 45.7%
Disgusted 45.3%
Happy 46.8%
Sad 46.2%
Confused 45.3%
Angry 46.3%

AWS Rekognition

Age 29-45
Gender Male, 54.8%
Angry 45.6%
Happy 46.3%
Confused 45.6%
Calm 45.7%
Surprised 45.2%
Sad 51.5%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Happy 46.3%
Sad 45.6%
Surprised 45.2%
Confused 45.3%
Disgusted 45.1%
Calm 52.2%
Angry 45.3%

Feature analysis

Amazon

Person 99.5%
Guitar 97.8%

Categories

Imagga

paintings art 98.2%
text visuals 1.5%

Text analysis

Amazon

RAMBLING
his
TEX
1940
VAGABONDIS
TEX and his
and

Google

TEX and his RAMBLING VAGABOND S 1940
TEX
and
his
RAMBLING
VAGABOND
S
1940