Human Generated Data

Title

Untitled (three men standing next to oil well)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2745

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.8
Human 99.8
Person 99.8
Person 99.4
Carpenter 95.1
Shoe 79.1
Footwear 79.1
Clothing 79.1
Apparel 79.1
Helmet 56.2

Clarifai
created on 2023-10-26

people 99.9
group together 98.9
group 97.4
man 97
adult 96.4
two 95.2
three 92.8
industry 92.5
construction worker 92.2
actor 91.7
many 91.6
vehicle 90
dig 83.8
monochrome 81.6
wear 80.6
transportation system 79.5
several 78.9
outfit 76.9
lid 75.8
five 74.5

Imagga
created on 2022-01-16

sky 21.1
volleyball net 20.5
sport 20.4
net 19.6
chairlift 18
conveyance 17.6
swing 17.5
shopping cart 17.5
man 16.8
beach 16.2
sunset 16.2
landscape 15.6
equipment 14.9
ski tow 14.9
mechanical device 14.6
handcart 14
outdoor 13.8
people 13.4
silhouette 13.2
game equipment 12.9
plaything 12.6
plow 12.4
tool 12.3
mechanism 11.9
old 11.8
sun 11.3
clouds 11
wheeled vehicle 10.7
male 10.6
structure 10.6
building 10.4
grass 10.3
sea 10.2
person 9.8
destruction 9.8
outdoors 9.7
summer 9.6
dusk 9.5
play 9.5
sunny 9.5
water 9.3
ocean 9.3
field 9.2
dark 9.2
black 9
vacation 9
metal 8.9
travel 8.5
protection 8.2
horizon 8.1
life 7.9
boy 7.8
nuclear 7.8
adult 7.8
rope 7.8
cloud 7.7
men 7.7
construction 7.7
outside 7.7
container 7.5
leisure 7.5
active 7.5
environment 7.4
freedom 7.3
business 7.3
recreation 7.2
sand 7.1
game 7.1
trees 7.1

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

text 98.6
outdoor 97.7
person 93.8
black and white 93.8
man 93.5
clothing 89.3
monochrome 58.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Female, 66%
Fear 60.1%
Calm 21.3%
Disgusted 9.9%
Angry 2.7%
Sad 1.8%
Surprised 1.6%
Confused 1.5%
Happy 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 79.1%

Text analysis

Amazon

@

Google

KODVK-2EE
KODVK-2EE