Human Generated Data

Title

Untitled (man posing with three women in front of large car at entrance to estate)

Date

1940-1960

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10032

Machine Generated Data

Tags (scores are detection confidence, 0-100)

Amazon
created on 2022-01-28

Person 99.5
Human 99.5
Person 99.2
Person 99.1
Transportation 99
Vehicle 99
Automobile 99
Car 99
Person 98.2
Antique Car 94.1
Hot Rod 84.4
Tire 80.7
Wheel 76.4
Machine 76.4
Spoke 70.8
Sports Car 63.5
Car Wheel 63.2
Asphalt 61.2
Tarmac 61.2
Nature 61.1
Outdoors 60.1
Clothing 59.3
Apparel 59.3
Sedan 55.4
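
The Amazon list above is consistent with the output of AWS Rekognition's DetectLabels operation. A minimal sketch follows, assuming a local scan of the photograph; the file name and the 55-point confidence floor are illustrative assumptions, not part of the record:

```python
# Reproduce an Amazon-style label list with AWS Rekognition via boto3.
import boto3

client = boto3.client("rekognition")

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed floor; the list above bottoms out near 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```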

Imagga
created on 2022-01-28

world 29
man 25.5
people 20.1
male 19.9
wheeled vehicle 19.9
skateboard 17.5
person 17.1
adult 15.6
board 14.8
vehicle 14.5
water 13.3
car 13.2
black 12.6
silhouette 12.4
outdoor 12.2
men 12
transportation 11.6
business 11.5
park 11.5
light 11.2
night 10.7
women 10.3
day 10.2
travel 9.9
destruction 9.8
portrait 9.7
sky 9.6
boy 9.6
blackboard 9.5
smoke 9.3
speed 9.2
outdoors 9.1
danger 9.1
sport 9
human 9
urban 8.7
summer 8.4
city 8.3
leisure 8.3
street 8.3
holding 8.2
dirty 8.1
suit 8.1
symbol 8.1
lifestyle 7.9
businessman 7.9
motor vehicle 7.9
smile 7.8
scene 7.8
sign 7.5
dark 7.5
ocean 7.5
transport 7.3
protection 7.3
sunset 7.2
cool 7.1
happiness 7
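
The Imagga list can be produced with Imagga's REST tagging endpoint. A minimal sketch, assuming placeholder credentials and a hypothetical file name:

```python
# Request Imagga tags for a local image via the v2 /tags endpoint.
import requests

API_KEY = "your_imagga_key"        # placeholder credential
API_SECRET = "your_imagga_secret"  # placeholder credential

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```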

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

tree 99.8
road 99.4
outdoor 98.3
text 98
vehicle 77.2
land vehicle 77.1
white 71.6
wheel 63.3
car 54.9
black and white 54.7
person 53.7
past 28.2
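
The Microsoft list matches the Tag operation of Azure Computer Vision. A minimal sketch, assuming a placeholder endpoint, key, and file name; the SDK reports confidence on a 0-1 scale, rescaled here to match the record:

```python
# Tag a local image with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("your_key"),                # placeholder
)

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```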

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 77.3%
Calm 86.5%
Sad 5.8%
Happy 5.3%
Confused 1.4%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 21-29
Gender Male, 86.4%
Calm 64.2%
Sad 21.4%
Confused 8.3%
Surprised 3.7%
Happy 0.7%
Angry 0.7%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 24-34
Gender Male, 69.1%
Calm 79%
Happy 16%
Sad 3.6%
Surprised 0.4%
Confused 0.4%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%
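
The three AWS Rekognition readouts above follow the shape of DetectFaces with full attributes, which returns one entry per detected face. A minimal sketch, assuming a hypothetical file name:

```python
# Per-face age, gender, and emotion estimates from AWS Rekognition.
import boto3

client = boto3.client("rekognition")

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```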

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
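
The Google Vision readouts report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of the same face pass, assuming a hypothetical file name:

```python
# Per-face likelihood buckets from Google Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```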

Feature analysis

Amazon

Person 99.5%
Car 99%
Wheel 76.4%
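
These three figures correspond to labels for which Rekognition also returned localized instances (bounding boxes) in the same DetectLabels response. A minimal sketch, assuming the same hypothetical file:

```python
# Labels with localized instances from the DetectLabels response.
import boto3

client = boto3.client("rekognition")

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label["Instances"]:  # empty for non-localized labels
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"at left={box['Left']:.2f}, top={box['Top']:.2f}")
```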

Captions

Microsoft

a person riding on the back of a truck 49.7%
a person sitting on the side of a road 49.6%
a person sitting on the side of the road 49.5%
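
The ranked caption candidates above match the Describe operation of Azure Computer Vision. A minimal sketch, assuming a placeholder endpoint, key, and file name:

```python
# Ranked caption candidates from Azure Computer Vision's Describe call.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("your_key"),                # placeholder
)

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```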

Text analysis

Google

MJI7-- YT37A°2--AGON
YT37A°2--AGON
MJI7--
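
The fragments above are raw OCR output: Google Vision text detection returns the full detected string first, then its component pieces, so the garbled characters are the service's literal reading of text in the image rather than an encoding error. A minimal sketch, assuming a hypothetical file name:

```python
# Raw text detections from Google Cloud Vision OCR.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_untitled.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)  # first entry is the full string
```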