Human Generated Data

Title

Untitled (four men standing outside next to irrigation device)

Date

1952-1957

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6349

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-01-22

Human 99.8
Person 99.8
Person 99.6
Person 99.3
Person 96.6
Clothing 93.9
Apparel 93.9
Nature 81.7
Outdoors 80.5
People 75.6
Transportation 71.8
Vehicle 71.4
Shorts 68.9
Ground 63.8
Creme 60.1
Icing 60.1
Food 60.1
Dessert 60.1
Cake 60.1
Cream 60.1
Road 57.3
Military Uniform 56.5
Military 56.5
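
The label/confidence pairs above are the standard output of Amazon Rekognition's DetectLabels operation. The following is a minimal sketch of such a call, assuming a placeholder S3 bucket and object key; it illustrates the API shape, not the exact pipeline behind this record.

```python
import boto3

# Sketch: detect labels for an image stored in S3. The bucket and key
# are placeholders, not values from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=25,
    MinConfidence=55.0,  # the scores above are confidence percentages
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```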

Clarifai
created on 2023-10-26

people 99.8
adult 98.7
group together 98.6
man 96.8
group 96.4
two 94
flame 93.6
three 90.4
vehicle 90.3
bucket 90.3
many 87.8
campsite 85.5
several 85.5
five 84.1
war 84
industry 82.9
woman 81.3
actor 80.5
soldier 79.5
mine 79.3
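
Clarifai concept scores like those above come from its predict endpoint run against a general image-recognition model. A hedged sketch follows, assuming a placeholder API key, the public general model ID, and a placeholder image URL.

```python
import requests

CLARIFAI_KEY = "YOUR_API_KEY"           # placeholder credential
MODEL_ID = "general-image-recognition"  # Clarifai's public general model

# Sketch of a predict call against a remote image; the URL is a placeholder.
response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Concepts carry a 0-1 value; scale to match the 0-100 scores shown above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```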

Imagga
created on 2022-01-22

shopping cart 44.6
handcart 38.4
wheeled vehicle 31
sky 26.4
container 25.9
landscape 22.3
sunset 21.6
man 21.5
water 20
beach 17.8
sea 17.2
outdoor 16.1
people 15.1
silhouette 14.9
summer 14.8
ocean 13.4
travel 13.4
male 12.8
old 12.5
dusk 12.4
person 12.4
sun 12.2
vacation 11.5
evening 11.2
conveyance 11
rural 10.6
clouds 10.1
sand 9.6
grass 9.5
building 9.4
lake 9.2
industrial 9.1
destruction 8.8
fog 8.7
happiness 8.6
cloud 8.6
tree 8.5
sunrise 8.4
adult 8.4
field 8.4
dark 8.4
life 8.3
environment 8.2
protection 8.2
horizon 8.1
vehicle 8.1
transportation 8.1
river 8
trees 8
fisherman 7.9
love 7.9
forest 7.8
boy 7.8
disaster 7.8
season 7.8
nuclear 7.8
industry 7.7
seascape 7.7
outdoors 7.6
climate 7.6
waves 7.4
tourism 7.4
road 7.2
black 7.2
coast 7.2
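
Imagga returns scored English tags of this kind from a simple REST endpoint. A minimal sketch, assuming placeholder Basic Auth credentials and image URL:

```python
import requests

# Sketch of Imagga's /v2/tags endpoint; credentials and URL are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder Basic Auth pair
)

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```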

Microsoft
created on 2022-01-22

text 94.1
outdoor 93.1
building 82.1
black and white 73.8
person 54.5
old 43.9
window 16.3
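
Microsoft's tags come from the Azure Computer Vision analyze endpoint, which reports 0-1 confidences. A hedged sketch with a placeholder endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

# Sketch of the v3.2 analyze call requesting tags for a remote image.
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)

# Confidences are 0-1; scale to match the 0-100 scores shown above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```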

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 96.8%
Sad 49.2%
Happy 19.8%
Calm 11.4%
Confused 8.4%
Angry 3.7%
Fear 3.6%
Surprised 2.7%
Disgusted 1.2%

AWS Rekognition

Age 24-34
Gender Male, 89.5%
Sad 52.5%
Happy 28.3%
Fear 8.8%
Calm 5.9%
Surprised 1.8%
Confused 1.6%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 36-44
Gender Female, 55.6%
Happy 70.4%
Calm 16.8%
Sad 4.9%
Surprised 2.9%
Fear 1.8%
Confused 1.6%
Angry 0.9%
Disgusted 0.7%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Calm 56.3%
Sad 29.5%
Happy 4.6%
Confused 4%
Fear 2.9%
Surprised 1.3%
Disgusted 0.8%
Angry 0.6%
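
Each block above corresponds to one face returned by Rekognition's DetectFaces operation with all attributes requested. A minimal sketch, again with placeholder S3 names:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] is required to get AgeRange, Gender, and Emotions.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Rank the emotion scores from most to least confident.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```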

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
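
Google Cloud Vision reports face attributes as bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the four blocks above read "Very unlikely" or "Unlikely". A sketch using the official Python client, with a placeholder local file standing in for the photograph:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# The file path is a placeholder for the image analyzed above.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY or UNLIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```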

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

TS3
KODAKSLA

Google

YT3RA2-XAGON
YT3RA2-XAGON
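
The fragments above are machine OCR output, most likely film edge markings rather than scene text. A minimal sketch of the kind of call that produces the Amazon results, using Rekognition's DetectText with placeholder S3 names:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

# Keep LINE detections; Rekognition also returns individual WORD detections.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```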