Human Generated Data

Title

Untitled (women on bikes in front of large house)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19414

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Machine 99.9
Wheel 99.9
Bike 99.9
Vehicle 99.9
Transportation 99.9
Bicycle 99.9
Wheel 99.4
Wheel 99.2
Person 99.1
Human 99.1
Bicycle 98.6
Cyclist 88.3
Sport 88.3
Sports 88.3
Person 84.4
Person 83.9
Person 82.9
Spoke 76.4
Person 62.7
Apparel 58.3
Shorts 58.3
Clothing 58.3
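
The Amazon tags above, with their per-label confidence scores, are the kind of output the Rekognition DetectLabels API returns. A minimal sketch of how such labels could be requested with boto3 (the image path and thresholds are placeholders, not part of the record):

```python
import boto3

# Assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50,
    )

# Each label carries a name and a confidence percentage, as listed above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```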

Imagga
created on 2022-03-05

tricycle 99.4
wheeled vehicle 88.1
vehicle 61.7
conveyance 42.4
architecture 21.3
old 20.2
house 19.2
building 18.9
winter 17
bicycle 15.1
sky 14.7
rural 13.2
cold 12.9
travel 12.7
road 12.6
night 12.4
snow 12
transportation 11.6
tree 11.5
structure 11.1
city 10.8
transport 10
trees 9.8
country 9.7
street 9.2
outdoor 9.2
tourism 9.1
black 9
history 8.9
bike 8.8
hill 8.4
window 8.2
landscape 8.2
mountain 8.1
light 8
home 8
wheelchair 8
roof 8
facade 7.8
scene 7.8
sport 7.7
exterior 7.4
historic 7.3
people 7.2
landmark 7.2
chair 7.2
farm 7.1
grass 7.1
season 7
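
Imagga exposes a REST tagging endpoint; a hedged sketch of how a tag list like the one above could be fetched (the credentials and image path are placeholders, and the response layout is assumed from Imagga's v2 API):

```python
import requests

# Placeholder credentials; Imagga uses an API key/secret pair as HTTP Basic auth.
API_KEY = "YOUR_API_KEY"
API_SECRET = "YOUR_API_SECRET"

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Tags come back with a confidence score and a language-keyed label.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```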

Google
created on 2022-03-05

Bicycle 97.8
Tire 97
Wheel 96.6
Window 94.3
Building 93.4
Bicycle wheel rim 91
Black 89.5
Vehicle 88.9
Motor vehicle 87.4
Bicycle tire 86.9
Bicycle wheel 85.7
Bicycle frame 85.1
House 83.4
Line 82.2
Facade 75.4
Snapshot 74.3
Door 69.7
Art 68.3
Room 68.2
Siding 67.9
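
The Google labels above correspond to Cloud Vision label detection; a minimal sketch with the google-cloud-vision client (the image path is a placeholder, and scores are scaled from 0-1 to percentages to match the list):

```python
from google.cloud import vision

# Assumes Google Cloud credentials are configured in the environment.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Label scores are 0-1; multiply by 100 to match the percentages above.
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))
```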

Microsoft
created on 2022-03-05

outdoor 99.9
building 99.7
bicycle 96.2
text 95.1
bicycle wheel 85
land vehicle 80.6
wheel 79.8
vehicle 73.1
black and white 71.3
black 68.6
sports equipment 64.6
bike 62.3
posing 60.4
old 41.9
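
The Microsoft tags above match the shape of Azure Computer Vision image tagging; a sketch assuming the azure-cognitiveservices-vision-computervision SDK (the endpoint and key are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Tag confidences are 0-1; scale to percentages to match the list above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```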

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 91.6%
Fear 98.8%
Calm 0.4%
Surprised 0.3%
Disgusted 0.2%
Sad 0.2%
Happy 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 27-37
Gender Male, 80.2%
Happy 60.4%
Angry 15.8%
Calm 8.2%
Fear 7.9%
Sad 5.3%
Disgusted 1.4%
Surprised 0.7%
Confused 0.4%
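
The age ranges, gender estimates, and emotion breakdowns above are the shape of the Rekognition DetectFaces response when all attributes are requested; a minimal boto3 sketch (the image path is a placeholder):

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with per-class confidences, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```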

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
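
The Google Vision entries report likelihood buckets rather than percentages; a sketch of face detection with the google-cloud-vision client, assuming the standard likelihood enums on each face annotation:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries likelihood enums (VERY_UNLIKELY ... VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```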

Feature analysis

Amazon

Wheel 99.9%
Bicycle 99.9%
Person 99.1%

Captions

Microsoft

a person with a bicycle in front of a house 90.6%
a person with a bicycle in front of a building 89.1%
a person standing in front of a building 89%
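
The ranked captions above are the kind of result Azure Computer Vision's image description returns; a sketch with the same SDK assumed earlier (endpoint and key are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption has a text and a 0-1 confidence.
for caption in description.captions:
    print(caption.text, round(caption.confidence * 100, 1))
```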

Text analysis

Amazon

1835
RACOX
YT3RAS RACOX
YT3RAS
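
The strings above are raw OCR output; a minimal sketch of how such detections could be pulled from Rekognition's DetectText API with boto3 (the image path is a placeholder):

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections with confidences.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))
```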

Google

LEEEE
LEEEE