Human Generated Data

Title

Untitled (two photographs: two men boxing in arena with audience in background; still life of fruit branches on white paper on floor)

Date

1945-1955, printed later

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6687

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.8
Human 99.8
Person 99.3
Apparel 97.8
Footwear 97.8
Clothing 97.8
Shoe 97.8
Sports 93.5
Sport 93.5
Shorts 88.7
Shoe 68.8
Boxing 68.7
Shoe 55.7

Imagga
created on 2022-01-22

device 21.5
airplane 19.3
aircraft 15.8
plane 14.5
flight 14.4
sky 13.4
flying 13.2
jet 11.5
fly 11.2
vehicle 11.2
air 11
propeller 10.8
business 10.3
transport 10
city 10
male 9.9
aviation 9.8
seat 9.7
military 9.6
sport 9.4
skateboard 9.3
wing 8.9
man 8.7
skill 8.7
design 8.5
black 8.4
board 8.3
transportation 8.1
light 8
wheeled vehicle 7.9
day 7.8
people 7.8
fighter 7.8
outdoor 7.6
fashion 7.5
vintage 7.4
torpedo 7.3
sun 7.2

Google
created on 2022-01-22

Photograph 94.3
Window 92.3
White 92.2
Black 89.9
Shorts 88.8
Rectangle 86.4
Black-and-white 86.2
Gesture 85.3
Style 84.1
Art 83
Plant 82.6
Line 81.9
Font 79.5
Monochrome photography 77.7
Monochrome 76
Snapshot 74.3
Room 69
Symmetry 67.1
Stock photography 66.5
Visual arts 64.5

Microsoft
created on 2022-01-22

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 86.1%
Calm 93.6%
Sad 5.4%
Confused 0.3%
Angry 0.3%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Female, 76.6%
Calm 91.6%
Sad 5.5%
Fear 1%
Happy 0.9%
Disgusted 0.3%
Angry 0.3%
Surprised 0.3%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 97.8%

Captions

Microsoft

a person jumping in the air 70.3%
a person jumping up in the air 65.7%
a person doing a trick on a skateboard 26.9%