Human Generated Data

Title

Untitled (people gathered around car)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16483

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.3
Human 99.3
Person 98.7
Person 98.5
Person 98.3
Person 98.1
Person 97.8
Person 97.8
Person 97.6
Person 96.5
Person 96.4
Person 96.1
Person 95.7
Person 95.6
Person 92.8
Person 86.5
Car 74.3
Automobile 74.3
Transportation 74.3
Vehicle 74.3
People 70.6
Person 70.3
Person 70.1
Crowd 68.9
Person 64.1
Clothing 64.1
Apparel 64.1
Pedestrian 58.4
Person 46.1
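
The label/confidence list above is characteristic of the AWS Rekognition DetectLabels API. As a minimal sketch (not necessarily the museum's actual pipeline), the following shows how such a list could be produced with boto3; the file path is a placeholder and configured AWS credentials are assumed. Repeated "Person" rows typically correspond to individual instances of a single Person label.

```python
import boto3

client = boto3.client("rekognition")

# "photo.jpg" is a placeholder for a scan of the print.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=40,  # the record above keeps labels down to ~46
    )

for label in response["Labels"]:
    # Confidence is reported on a 0-100 scale, matching the list above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    # Countable labels such as Person also return per-instance
    # detections, each with its own confidence and bounding box.
    for instance in label.get("Instances", []):
        print(f'{label["Name"]} {instance["Confidence"]:.1f}')
```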

Imagga
created on 2022-02-11

toaster 19.9
water 17.3
architecture 16.6
negative 16.4
kitchen appliance 16.2
modern 15.4
night 15.1
structure 14.4
interior 14.1
home appliance 14
light 14
film 13.9
travel 13.4
room 11.5
technology 11.1
city 10.8
grand piano 10.6
hotel 10.5
blackboard 10.3
glass 10.3
appliance 10.2
building 10.2
sky 10.2
photographic paper 10
hall 9.9
snow 9.8
urban 9.6
pool 9.6
design 9.6
piano 9.6
construction 9.4
house 9.2
transportation 9
empty 8.6
luxury 8.6
window 8.6
equipment 8.5
art 8.3
vacation 8.2
landscape 8.2
metal 8
lake 7.7
boat 7.6
floor 7.4
tourism 7.4
inside 7.4
car 7.3
indoor 7.3
futuristic 7.2
furniture 7.1
percussion instrument 7.1
river 7.1
cool 7.1
sketch 7
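
The tag list above (including misfires such as "toaster" for a black-and-white negative) has the shape of Imagga's auto-tagging output. A minimal sketch against the Imagga v2 /tags REST endpoint, assuming the requests library; the credentials and image URL are placeholders:

```python
import requests

API_KEY = "<imagga-api-key>"                 # placeholder
API_SECRET = "<imagga-api-secret>"           # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth
)
resp.raise_for_status()

# Each entry pairs a language-keyed tag with a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```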

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 92
outdoor 89.1
black and white 72.6
tree 68.5
white 61.9
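
Tag lists like the Microsoft one above can be retrieved with the Azure Computer Vision SDK. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",   # placeholder
    CognitiveServicesCredentials("<subscription-key>"),  # placeholder
)

result = client.tag_image("https://example.com/photo.jpg")  # placeholder URL
for tag in result.tags:
    # The SDK reports confidence on a 0-1 scale; the record above uses 0-100.
    print(tag.name, round(tag.confidence * 100, 1))
```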

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 94.9%
Happy 44.1%
Calm 43%
Sad 7.6%
Surprised 1.4%
Angry 1%
Confused 1%
Fear 1%
Disgusted 0.9%

AWS Rekognition

Age 19-27
Gender Male, 57.3%
Happy 61.7%
Sad 9.9%
Calm 8.9%
Disgusted 6%
Fear 5%
Surprised 4.8%
Confused 2%
Angry 1.8%

AWS Rekognition

Age 22-30
Gender Male, 53.7%
Calm 43%
Sad 39.3%
Angry 6.6%
Disgusted 3.5%
Confused 2.4%
Surprised 2.4%
Happy 1.5%
Fear 1.3%

AWS Rekognition

Age 25-35
Gender Male, 93.6%
Happy 70%
Calm 20%
Fear 2.7%
Sad 2.4%
Surprised 1.9%
Angry 1.3%
Disgusted 0.9%
Confused 0.8%

AWS Rekognition

Age 23-33
Gender Male, 94.1%
Happy 86.9%
Calm 8.2%
Sad 1.6%
Angry 1.2%
Confused 0.7%
Surprised 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 18-24
Gender Female, 67.2%
Happy 97.9%
Calm 0.8%
Fear 0.6%
Sad 0.3%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-33
Gender Male, 86.9%
Happy 65.8%
Calm 18.7%
Fear 8.3%
Sad 3.7%
Surprised 1.2%
Angry 0.8%
Confused 0.7%
Disgusted 0.7%

AWS Rekognition

Age 13-21
Gender Male, 51%
Calm 42.7%
Fear 18.8%
Sad 16.9%
Happy 10.2%
Confused 4.1%
Surprised 3.2%
Angry 2.4%
Disgusted 1.7%

AWS Rekognition

Age 23-31
Gender Female, 71.5%
Calm 72%
Sad 14.7%
Happy 8.4%
Confused 2%
Angry 0.8%
Fear 0.8%
Disgusted 0.7%
Surprised 0.5%

AWS Rekognition

Age 26-36
Gender Male, 95.1%
Calm 82%
Sad 11%
Confused 1.9%
Angry 1.7%
Fear 1.1%
Happy 1.1%
Disgusted 0.7%
Surprised 0.4%

AWS Rekognition

Age 24-34
Gender Male, 96.8%
Calm 59.6%
Happy 9.4%
Disgusted 8.4%
Angry 7.5%
Sad 7%
Surprised 3.5%
Confused 2.5%
Fear 2.1%

AWS Rekognition

Age 53-61
Gender Male, 96.1%
Calm 77.1%
Happy 6%
Sad 5.4%
Fear 4.2%
Disgusted 3.6%
Confused 2%
Angry 1.2%
Surprised 0.5%
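
The face records above follow the shape of Rekognition's DetectFaces response when called with Attributes=["ALL"]. A minimal sketch, again assuming boto3 and a placeholder image path:

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 22, "High": 30}
    gender = face["Gender"]  # e.g. {"Value": "Male", "Confidence": 94.9}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unordered; the record above lists them by
    # descending confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```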

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
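
The "Very unlikely"/"Likely" buckets above correspond to the Likelihood enum returned by Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library with application-default credentials and a placeholder image path:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY, UNLIKELY,
    # POSSIBLE, LIKELY, or VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```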

Feature analysis

Amazon

Person 99.3%
Car 74.3%

Captions

Microsoft

a vintage photo of a car 67.5%
a vintage photo of a truck 67.4%
a vintage photo of a parked car 57.6%
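
Ranked captions of this form are typical of the Azure Computer Vision describe operation. A minimal sketch, with the same placeholder endpoint, key, and image URL as in the tags sketch above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",   # placeholder
    CognitiveServicesCredentials("<subscription-key>"),  # placeholder
)

description = client.describe_image("https://example.com/photo.jpg")
for caption in description.captions:
    # Candidate captions each carry a 0-1 confidence; the record above
    # shows the same scores on a 0-100 scale.
    print(caption.text, round(caption.confidence * 100, 1))
```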

Text analysis

Google