Human Generated Data

Title

Untitled (young men in tuxedos next to car)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17335

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Clothing 99.1
Apparel 99.1
Car 98.9
Automobile 98.9
Transportation 98.9
Vehicle 98.9
Person 98.7
Person 97.6
Person 96.9
Suit 92.8
Overcoat 92.8
Coat 92.8
Shoe 88.2
Footwear 88.2
Wheel 84.4
Machine 84.4
Tire 84.2
Building 75.6
Tie 72.7
Accessories 72.7
Accessory 72.7
Wheel 72.1
Person 71.9
Car Wheel 67.1
Shorts 65.7
People 65.7
Bicycle 65.6
Bike 65.6
Tennis Racket 64.9
Racket 64.9
Face 63.2
Photography 61.4
Photo 61.4
Portrait 60.9
Pedestrian 56.6
Pants 56.5
Shoe 53.1

Clarifai
created on 2023-10-28

people 99.7
adult 96.8
monochrome 96.3
two 95.9
man 95
group together 94
vehicle 91.9
woman 91.6
street 88.8
outfit 88.4
leader 87.1
group 86.4
uniform 85.9
portrait 85.1
three 82.9
several 82.8
administration 82.7
wear 82.7
one 80.4
four 79.7

Imagga
created on 2022-02-26

person 29.9
man 29.6
people 29
adult 22
male 22
car 21.6
groom 21.2
cricket bat 19.2
cricket equipment 18.9
sport 18.6
happiness 16.4
clothing 16.1
equipment 15.8
couple 15.7
park 15.6
sports equipment 14.8
summer 14.8
uniform 14.5
outdoors 14.3
wedding 13.8
happy 13.8
bride 13.4
love 13.4
outdoor 13
sky 12.7
fun 12.7
two 12.7
vehicle 12.7
day 12.5
portrait 12.3
world 12.1
mask 11.5
life 11.1
grass 11.1
protection 10.9
danger 10.9
field 10.9
married 10.5
automobile 10.5
player 10.3
men 10.3
smiling 10.1
competition 10.1
playing 10
travel 9.9
auto 9.6
women 9.5
youth 9.4
gun 9.2
leisure 9.1
active 9.1
dress 9
suit 9
military 8.7
lifestyle 8.7
wife 8.5
motor vehicle 8.4
athlete 8.4
vacation 8.2
activity 8.1
businessman 7.9
limousine 7.9
business 7.9
standing 7.8
soldier 7.8
destruction 7.8
accident 7.8
play 7.8
beach 7.7
old 7.7
free 7.5
human 7.5
training 7.4
transport 7.3
sun 7.2
color 7.2
transportation 7.2
bright 7.1
smile 7.1
sunlight 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 96
text 95.6
land vehicle 91.9
car 91.6
vehicle 91.2
black and white 89.8
person 79
man 78.6
white 72.8
clothing 69.9
wheel 69.3
old 61
vintage 29.2

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 77.8%
Calm 45.3%
Confused 35.7%
Sad 10.1%
Disgusted 2.5%
Angry 2.2%
Surprised 1.7%
Fear 1.2%
Happy 1.2%

AWS Rekognition

Age 40-48
Gender Male, 66%
Calm 66.1%
Happy 24.6%
Sad 3.3%
Confused 1.9%
Surprised 1.7%
Angry 1.2%
Disgusted 0.9%
Fear 0.3%

AWS Rekognition

Age 22-30
Gender Male, 72.9%
Calm 72.1%
Sad 9.3%
Happy 6.1%
Fear 5%
Surprised 2.8%
Angry 2.1%
Disgusted 1.9%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Car
Shoe
Wheel
Tie
Bicycle
Tennis Racket
Person 99.7%
Person 98.7%
Person 97.6%
Person 96.9%
Person 71.9%
Car 98.9%
Shoe 88.2%
Shoe 53.1%
Wheel 84.4%
Wheel 72.1%
Tie 72.7%
Bicycle 65.6%

Text analysis

Amazon

M H7
M H7 YT3RAS
YT3RAS

Google

M H3 YT33A2
M
H3
YT33A2