Human Generated Data

Title

Untitled (adults and children sitting in car)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16889

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Automobile 98.8%
Vehicle 98.8%
Transportation 98.8%
Human 98.5%
Person 98.5%
Convertible 98.3%
Person 98.3%
Person 98.3%
Person 97.1%
Car 96.5%
Person 95.7%
Plant 83.5%
Vegetation 83.5%
Hot Rod 75.3%
Sedan 73.2%
Tree 69.1%
Grove 69.1%
Nature 69.1%
Outdoors 69.1%
Land 69.1%
Woodland 69.1%
Forest 69.1%
Antique Car 68.8%
Driving 66.2%
People 64.6%
Bumper 59.2%
Sports Car 57.7%
Model T 56.5%
Coupe 55.2%
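
The labels above have the shape of Amazon Rekognition DetectLabels output: a label name plus a 0-100 confidence score. A minimal Python sketch of such a call; the boto3 credentials, the file name image.jpg, and the MinConfidence cutoff are assumptions, not part of this record.

    import boto3

    client = boto3.client("rekognition")

    # Hypothetical local scan of the photograph.
    with open("image.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the lowest tag above is 55.2
        )

    # DetectLabels returns {"Labels": [{"Name": ..., "Confidence": ...}, ...]}.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")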

Imagga
created on 2022-02-26

windowsill 100%
sill 94.1%
structural member 70.6%
car 51%
support 45.4%
vehicle 31.8%
device 25.6%
road 22.6%
transportation 22.4%
automobile 21%
speed 20.1%
auto 17.2%
transport 16.4%
driving 15.4%
freight car 15.1%
urban 14.8%
modern 14.7%
drive 14.2%
travel 14.1%
fast 13.1%
street 12.9%
architecture 12.5%
sky 12.1%
light 12%
luxury 12%
outdoors 11.9%
interior 11.5%
wheel 11.3%
mirror 11.3%
building 11.2%
lifestyle 10.8%
city 10.8%
highway 10.6%
wheeled vehicle 10.4%
motion 10.3%
black 10.2%
reflection 9.8%
traffic 9.5%
color 9.4%
day 9.4%
window 9.2%
fountain 9.2%
snow 8.8%
furniture 8.7%
power 8.4%
blur 8.4%
landscape 8.2%
technology 8.2%
style 8.2%
metal 8%
design 7.9%
art 7.8%
glass 7.8%
winter 7.7%
outdoor 7.6%
dark 7.5%
screen 7.4%
sports 7.4%
ice 7.4%
inside 7.4%
water 7.3%
new 7.3%
people 7.2%
detail 7.2%
home 7.2%
trees 7.1%
night 7.1%
summer 7.1%
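
Imagga exposes its tagger as a REST endpoint with HTTP Basic authentication. A sketch of how tags like those above could be fetched with the requests library; the key, secret, and image URL are placeholders, not values from this record.

    import requests

    API_KEY = "your_api_key"        # placeholder credentials
    API_SECRET = "your_api_secret"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/scan.jpg"},  # placeholder URL
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each entry carries a 0-100 confidence and per-language tag names.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}%")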

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

tree 99.8%
outdoor 94.6%
text 86.7%
black and white 80.6%
land vehicle 76.4%
vehicle 74.8%
car 72.3%
sketch 53.7%
old 48.9%
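
These tags match the output of the Azure Computer Vision tagging operation. The SDK reports confidence on a 0-1 scale, so the sketch below scales it to percentages; the endpoint, key, and file name are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("your_key"),                # placeholder
    )

    # Hypothetical local scan of the photograph.
    with open("image.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}%")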

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 82%
Calm 69.8%
Happy 23.9%
Sad 3%
Surprised 1.7%
Angry 0.6%
Disgusted 0.5%
Confused 0.4%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 97.2%
Happy 0.9%
Surprised 0.6%
Confused 0.4%
Fear 0.4%
Sad 0.3%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 49-57
Gender Male, 65.5%
Happy 35.2%
Calm 33.7%
Surprised 20.9%
Angry 6.8%
Fear 1.6%
Disgusted 0.8%
Confused 0.5%
Sad 0.5%

AWS Rekognition

Age 23-33
Gender Female, 99.6%
Calm 99.6%
Fear 0.1%
Surprised 0.1%
Happy 0%
Confused 0%
Sad 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 48-56
Gender Male, 66%
Calm 93.6%
Happy 5.5%
Surprised 0.4%
Sad 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0%
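
The five blocks above have the shape of Amazon Rekognition DetectFaces output when called with Attributes=["ALL"], which adds an age range, a gender estimate, and a confidence per emotion for each detected face. A minimal sketch, assuming boto3 credentials and a placeholder file name:

    import boto3

    client = boto3.client("rekognition")

    # Hypothetical local scan of the photograph.
    with open("image.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")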

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
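
Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the five blocks above read differently from the Rekognition ones. A sketch using the google-cloud-vision client, with the file name as a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local scan of the photograph.
    with open("image.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        attributes = {
            "Surprise": face.surprise_likelihood,
            "Anger": face.anger_likelihood,
            "Sorrow": face.sorrow_likelihood,
            "Joy": face.joy_likelihood,
            "Headwear": face.headwear_likelihood,
            "Blurred": face.blurred_likelihood,
        }
        for name, value in attributes.items():
            # e.g. VERY_UNLIKELY -> "Very unlikely"
            print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())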

Feature analysis

Amazon

Person 98.5%
Car 96.5%

Captions

Microsoft

a person sitting in front of a car 74.5%
a person sitting in a car 72.7%
a person standing in front of a car 72.6%
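
Ranked captions with confidence scores like these are what the Azure Computer Vision describe operation returns. A sketch, assuming the same placeholder endpoint and key as in the tagging example above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("your_key"),                # placeholder
    )

    # Hypothetical local scan; max_candidates asks for several ranked captions.
    with open("image.jpg", "rb") as f:
        result = client.describe_image_in_stream(f, max_candidates=3)

    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")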

Text analysis

Amazon

5
26
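
Short strings like these are what Amazon Rekognition DetectText returns: each TextDetection carries the detected text, a LINE or WORD type, and a confidence. A sketch, assuming boto3 credentials and a placeholder file name:

    import boto3

    client = boto3.client("rekognition")

    # Hypothetical local scan of the photograph.
    with open("image.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the per-word duplicates
            print(detection["DetectedText"])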

Google

MJI7--YT
MJI7--YT 33A°2-- AGOX
33A°2--
AGOX
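
Google Cloud Vision's text detection returns a list of annotations in which the first entry spans the whole detected block and the rest are individual fragments, which matches the four lines above. A sketch with a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local scan of the photograph.
    with open("image.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First annotation = full block; subsequent annotations = fragments.
    for annotation in response.text_annotations:
        print(annotation.description)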