Human Generated Data

Title

Untitled (man posing seated on car with violin in lap)

Date

1951

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6397

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Person 99.6
Human 99.6
Performer 98.6
Musical Instrument 98.6
Leisure Activities 98.6
Musician 98.6
Guitarist 98.6
Guitar 95.5
Shoe 94.7
Clothing 94.7
Footwear 94.7
Apparel 94.7
Transportation 76.8
Car 76.8
Automobile 76.8
Vehicle 76.8
Photography 62.1
Photo 62.1
Face 62.1
Portrait 62.1
Airplane 58.4
Aircraft 58.4
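
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. A minimal sketch of extracting such pairs from a response of that documented shape — the real call (commented out) would need boto3 and AWS credentials, and the sample below reuses a few labels from this record:

```python
# Parsing label/confidence pairs in the shape returned by AWS Rekognition
# DetectLabels. A real call would look like:
#   rekognition = boto3.client("rekognition")
#   response = rekognition.detect_labels(
#       Image={"S3Object": {"Bucket": bucket, "Name": key}})
# (bucket and key are hypothetical placeholders).

def top_labels(response, min_confidence=50.0):
    """Return (name, confidence) pairs sorted by descending confidence."""
    labels = [(lbl["Name"], round(lbl["Confidence"], 1))
              for lbl in response["Labels"]
              if lbl["Confidence"] >= min_confidence]
    return sorted(labels, key=lambda pair: -pair[1])

# Sample fragment in Rekognition's documented response shape,
# using confidences from this record:
sample = {"Labels": [
    {"Name": "Person", "Confidence": 99.6},
    {"Name": "Guitar", "Confidence": 95.5},
    {"Name": "Airplane", "Confidence": 58.4},
]}
print(top_labels(sample))  # [('Person', 99.6), ('Guitar', 95.5), ('Airplane', 58.4)]
```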

Clarifai
created on 2019-03-22

people 99.8
vehicle 99.2
adult 98.7
two 98.3
man 98.2
transportation system 97.2
one 97.2
car 96.3
woman 95.8
portrait 90.5
group 90
aircraft 88.6
wear 87.6
facial expression 87
three 85.4
actress 85
group together 84.7
actor 83
administration 80.7
indoors 80

Imagga
created on 2019-03-22

man 25.5
person 25
male 24.9
people 22.3
adult 16.9
groom 14.9
happy 13.8
smiling 13
home 12.8
couple 12.2
uniform 12
men 12
house 11.7
senior 11.2
looking 11.2
sitting 11.2
old 11.1
portrait 11
car 10.8
smile 10.7
family 10.7
human 10.5
vehicle 9.5
mature 9.3
chair 9.2
transportation 9
patient 8.8
stretcher 8.8
room 8.6
travel 8.4
conveyance 8.4
vintage 8.3
musical instrument 8.1
grandfather 8
medical 7.9
business 7.9
architecture 7.8
retired 7.8
retirement 7.7
elderly 7.7
husband 7.6
fashion 7.5
care 7.4
litter 7.1
interior 7.1
businessman 7.1
happiness 7.1
indoors 7

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

person 93.7
outdoor 88.7
black and white 87.8
street 28.7
monochrome 22.4
winter 20

Face analysis

Amazon

AWS Rekognition

Age 45-63
Gender Male, 62.7%
Confused 6.3%
Angry 24.4%
Sad 24.2%
Calm 28.6%
Disgusted 3%
Surprised 7.5%
Happy 6%

Feature analysis

Amazon

Person 99.6%
Guitar 95.5%
Shoe 94.7%
Car 76.8%
Airplane 58.4%
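
The age range, gender, and emotion scores above follow the shape of a Rekognition DetectFaces `FaceDetail` object. A sketch of summarizing one face from a response of that shape (a live call would need boto3 and credentials; the sample values are taken from this record):

```python
def summarize_face(face):
    """Summarize one Rekognition FaceDetail: age range, gender, top emotion."""
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions is a list of {"Type", "Confidence"}; pick the strongest.
    emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f'{age["Low"]}-{age["High"]}',
        "gender": f'{gender["Value"]}, {gender["Confidence"]:.1f}%',
        "emotion": f'{emotion["Type"]}, {emotion["Confidence"]:.1f}%',
    }

# Sample FaceDetail in Rekognition's documented shape, with values
# from the face analysis above:
sample = {
    "AgeRange": {"Low": 45, "High": 63},
    "Gender": {"Value": "Male", "Confidence": 62.7},
    "Emotions": [
        {"Type": "CALM", "Confidence": 28.6},
        {"Type": "ANGRY", "Confidence": 24.4},
        {"Type": "SAD", "Confidence": 24.2},
    ],
}
print(summarize_face(sample))
# {'age': '45-63', 'gender': 'Male, 62.7%', 'emotion': 'CALM, 28.6%'}
```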

Text analysis

Amazon

TEXAS
TEXAS 52
52
0

Google

EXAS 52
EXAS
52
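
The detected strings above (the Texas license plate) are of the kind returned by Rekognition's DetectText API, which reports each detection as a LINE or its constituent WORDs. A sketch of pulling line-level text out of a response of that shape — the confidences in the sample are illustrative, not from this record:

```python
def detected_lines(response, min_confidence=80.0):
    """Collect LINE-level detections from a Rekognition DetectText response."""
    return [d["DetectedText"]
            for d in response["TextDetections"]
            if d["Type"] == "LINE" and d["Confidence"] >= min_confidence]

# Sample in Rekognition's documented response shape; confidence
# values here are hypothetical:
sample = {"TextDetections": [
    {"DetectedText": "TEXAS 52", "Type": "LINE", "Confidence": 93.2},
    {"DetectedText": "TEXAS", "Type": "WORD", "Confidence": 94.1},
    {"DetectedText": "52", "Type": "WORD", "Confidence": 92.3},
]}
print(detected_lines(sample))  # ['TEXAS 52']
```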