Human Generated Data

Title

Untitled (children sitting in small toy train ride)

Date

1954

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14913

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.6
Human 99.6
Person 99.1
Nature 98.3
Person 97.9
Person 97.3
Person 97.3
Outdoors 97.1
Person 97
Person 95.2
Person 94.7
Building 94.2
Poster 92.6
Advertisement 92.6
Countryside 91.3
Rural 87.1
Hut 87.1
Shack 85.1
Housing 82.6
Person 77.1
House 65.3
Shelter 60.1
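The Amazon list above pairs each label with a confidence score. A minimal Python sketch of filtering such pairs by a minimum confidence (the pairs are transcribed from the list above; the 90-point threshold is an arbitrary choice for illustration):

```python
# Label/confidence pairs transcribed from the Amazon tag list above
# (duplicate "Person" detections collapsed to one entry).
amazon_tags = [
    ("Person", 99.6), ("Human", 99.6), ("Nature", 98.3),
    ("Outdoors", 97.1), ("Building", 94.2), ("Poster", 92.6),
    ("Advertisement", 92.6), ("Countryside", 91.3), ("Rural", 87.1),
    ("Hut", 87.1), ("Shack", 85.1), ("Housing", 82.6),
    ("House", 65.3), ("Shelter", 60.1),
]

def confident_labels(tags, threshold=90.0):
    """Return labels whose confidence meets the threshold, highest first."""
    kept = [(label, score) for label, score in tags if score >= threshold]
    return [label for label, _ in sorted(kept, key=lambda t: -t[1])]

print(confident_labels(amazon_tags))
# At a 90-point threshold, the scene-level tags (Rural, Hut, Shack,
# Housing, House, Shelter) drop out, leaving only the strongest labels.
```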

Imagga
created on 2022-01-29

shop 67.2
barbershop 51.3
mercantile establishment 51.1
building 38.7
architecture 36.7
place of business 34.1
balcony 33.1
window 30.9
house 30.1
old 28.5
bakery 25.4
wall 23.9
city 22.4
structure 19.8
home 18.3
exterior 17.5
establishment 17
town 16.7
urban 16.6
glass 16.3
historic 15.6
history 15.2
ancient 14.7
street 13.8
windows 13.4
travel 13.4
roof 12.4
vintage 11.6
facade 11.6
tourism 11.5
stone 11
high 10.4
brick 10.4
residence 9.7
door 9.7
architectural 9.6
historical 9.4
aged 9
retro 9
sky 8.9
detail 8.8
light 8.7
residential 8.6
buildings 8.5
iron 8.4
people 8.4
decoration 8.2
interior 8
design 7.9
antique 7.8
culture 7.7
lamp 7.6
classic 7.4
landmark 7.2
holiday 7.2
colorful 7.2
wooden 7

Google
created on 2022-01-29

Window 93.5
Vertebrate 92
Mammal 85.4
House 85.3
Building 83.3
Cottage 74
Siding 73
Room 68.8
Monochrome 67.3
Log cabin 64.2
Illustration 63.9
Monochrome photography 62.4
Wood 59.3
History 58.8
Roof 57.8
Art 57.1
Shed 54.3
Photographic paper 50.6
Tree 50.3

Microsoft
created on 2022-01-29

building 99
house 96.5
text 90.5
outdoor 88.9
person 80.7
clothing 67.2

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 96%
Calm 82.2%
Happy 11.7%
Surprised 4.3%
Sad 0.7%
Disgusted 0.5%
Confused 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Male, 96.8%
Happy 91.5%
Fear 3.7%
Sad 1.4%
Angry 1%
Calm 0.7%
Disgusted 0.7%
Surprised 0.6%
Confused 0.5%

AWS Rekognition

Age 27-37
Gender Female, 98.7%
Calm 69.8%
Happy 11.3%
Confused 7.5%
Sad 3.7%
Disgusted 3.3%
Surprised 1.8%
Angry 1.4%
Fear 1%

AWS Rekognition

Age 22-30
Gender Female, 97.9%
Happy 81.9%
Surprised 9.2%
Disgusted 3%
Confused 2.1%
Calm 1.9%
Angry 0.8%
Sad 0.8%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Poster 92.6%

Captions

Microsoft

a group of people riding on the back of a house 74.7%
a group of people in front of a building 74.6%
a group of people standing in front of a building 74.5%

Text analysis

Amazon

FIC
SCO
PA
SCO E PA
E

Google

FIC
FIC