Human Generated Data

Title

Untitled (family riding in convertible)

Date

1959

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17283

Machine Generated Data

Tags (label with confidence score, 0-100)

Amazon
created on 2022-02-26

Person 99
Human 99
Person 98.9
Person 98.8
Transportation 97.3
Vehicle 97.3
Automobile 97.3
Person 96.5
Convertible 94.7
Car 93.7
Person 93.4
Vegetation 91.2
Plant 91.2
Tree 85.2
Woodland 85.2
Forest 85.2
Nature 85.2
Outdoors 85.2
Grove 85.2
Land 85.2
Driving 70.2
Face 68.1
Hot Rod 66.3
People 64.5
Portrait 62.3
Photography 62.3
Photo 62.3
Sedan 61.5
Antique Car 59.1
Wheel 55.3
Machine 55.3
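
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API (confidence on a 0-100 scale). A minimal Python sketch follows, assuming boto3, a local copy of the photograph, and a MinConfidence threshold chosen to match the lowest score listed; the exact call used to build this page is not documented here.

import boto3

# Assumption: a local file holding the photograph; the real image source is unknown.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")

# DetectLabels returns Name/Confidence pairs such as "Person 99" or "Convertible 94.7".
# MinConfidence=55 is an assumed cutoff matching the lowest tags above (Wheel/Machine 55.3).
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')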

Imagga
created on 2022-02-26

car 100
vehicle 56.9
convertible 43.4
motor vehicle 42.1
transportation 42.1
automobile 41.1
auto 37.3
road 31.6
drive 31.2
transport 30.1
wheel 29.8
speed 29.3
driving 24.1
travel 21.1
sedan 21
motor 20.3
driver 18.4
wheeled vehicle 18.3
luxury 18
fast 16.8
seat 16.8
traffic 16.1
modern 15.4
mirror 15.2
device 14.9
support 14.7
highway 14.5
windowsill 14.4
street 13.8
new 12.9
groom 12.7
engine 12.5
race 12.4
metal 12.1
power 11.7
people 11.7
sill 11.5
sitting 11.2
motion 11.1
lifestyle 10.8
sky 10.8
sport 10.7
style 10.4
tire 10.3
happy 10
reflection 9.7
outdoors 9.7
person 9.6
black 9.6
windshield 9.6
urban 9.6
color 9.5
light 9.4
smile 9.3
metallic 9.2
adult 9
asphalt 8.9
technology 8.9
structural member 8.7
moving 8.6
elegant 8.6
design 8.4
outdoor 8.4
window 8.4
sports 8.3
fashion 8.3
inside 8.3
self-propelled vehicle 8.1
man 8.1
freight car 7.9
happiness 7.8
model 7.8
attractive 7.7
blurred 7.7
performance 7.7
casual 7.6
blur 7.4
screen 7.4
mechanical device 7.2
detail 7.2
building 7.2
shiny 7.1
windshield wiper 7.1
interior 7.1
summer 7.1
rumble 7
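
The Imagga tags follow the same label-plus-confidence pattern. Below is a hedged sketch against Imagga's public v2 tagging endpoint using the requests library; the API key, secret, and image URL are placeholders, and the response layout is taken from Imagga's published documentation rather than from this page.

import requests

# Assumptions: an Imagga API key/secret pair and a publicly reachable image URL.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, e.g. "car 100".
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], item["confidence"])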

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

tree 99.6
outdoor 97.1
black and white 90
text 87
vehicle 83.7
car 82.8
land vehicle 82.7
transport 74.4
white 63.4
old 44.2
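
The Microsoft tags were presumably produced by the Azure Computer Vision Analyze API, which reports confidence on a 0-1 scale (so "tree 99.6" corresponds to 0.996). A sketch over the v3.2 REST endpoint follows; the resource endpoint, key, and image URL are placeholders.

import requests

# Assumptions: an Azure Computer Vision resource and an image URL; none of
# these values appear on this page.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
response.raise_for_status()

# Scale the 0-1 confidence by 100 to match the listing above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')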

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 54.2%
Calm 97.7%
Sad 1%
Surprised 0.7%
Happy 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 74.6%
Calm 51.4%
Happy 44.8%
Sad 1%
Angry 0.7%
Confused 0.7%
Disgusted 0.6%
Surprised 0.6%
Fear 0.3%

AWS Rekognition

Age 23-33
Gender Female, 92.8%
Calm 30.7%
Sad 21.6%
Surprised 16.9%
Angry 8.9%
Happy 7.2%
Fear 6.2%
Confused 5.7%
Disgusted 2.8%

AWS Rekognition

Age 52-60
Gender Male, 99.8%
Calm 91.8%
Surprised 3.7%
Confused 1.8%
Happy 1.3%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
Sad 0.1%

AWS Rekognition

Age 27-37
Gender Male, 79.8%
Happy 38.4%
Calm 34.8%
Surprised 16.9%
Fear 3.7%
Confused 2.5%
Disgusted 1.7%
Sad 1.1%
Angry 0.9%
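
Each block above matches one entry in the FaceDetails array returned by AWS Rekognition DetectFaces when all attributes are requested: an estimated age range, a gender guess with confidence, and a score for each of eight emotions. A minimal boto3 sketch, again assuming a local image file:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 36, "High": 44}
    gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 54.2}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sorting by confidence reproduces the ordering above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')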

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
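
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than numeric scores, one block per detected face. A sketch with the google-cloud-vision client library; the image URI is a placeholder, and authentication via application default credentials is assumed.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"  # assumed image location

response = client.face_detection(image=image)

# Each face annotation carries a Likelihood enum for the six attributes listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)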

Feature analysis

Amazon

Person 99%
Car 93.7%

Captions

Microsoft

a person sitting in front of a car 78.3%
a person sitting in a car 77%
a car parked on the side of a road 76.9%
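
The ranked captions look like output from the Azure Computer Vision Describe API, which returns several candidate sentences with 0-1 confidences (0.783 for the top caption here). A sketch over the v3.2 REST endpoint, reusing the placeholder resource values from the tags example:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # assumed, to match the three captions above
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')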

Text analysis

Amazon

24
KODAK---ITW

Google

24
24
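
The detected strings are film edge markings: the frame number "24" and a partially legible Kodak edge print. On the Amazon side this is the shape of AWS Rekognition DetectText output; a minimal boto3 sketch with the same assumed local file follows. DetectText emits both LINE and WORD entries, which is one reason OCR listings (like Google's "24" appearing twice) can repeat a token.

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # assumed local copy of the photograph
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries cover whole strings; WORD entries repeat their components.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')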