Human Generated Data

Title

Untitled (men working with carcasses at Kreuz meat market)

Date

1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2600

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 97.8
Apparel 88.1
Clothing 88.1
Dog 70.1
Mammal 70.1
Animal 70.1
Canine 70.1
Pet 70.1
People 67.3
Building 66.8
Urban 66.1
Shorts 58.1
Outdoors 55.8

Clarifai
created on 2023-10-26

people 99.9
adult 98.6
man 97.2
two 96.2
vehicle 96
watercraft 95.3
group 94.9
group together 94.9
one 94.5
woman 94
wear 93.1
transportation system 89.6
administration 89.2
three 89
military 86.9
four 85.5
actor 84.2
furniture 81.7
commerce 80
industry 79.3

Imagga
created on 2022-01-15

loom 100
textile machine 88.3
machine 67.7
device 49.5
building 19
man 17.5
city 15
travel 14.8
water 14.7
sky 14.7
people 13.9
work 13.4
architecture 13.3
ship 13
boat 13
construction 12.8
modern 12.6
power 12.6
business 12.1
equipment 11.8
adult 11.7
worker 11.6
light 11.4
urban 11.4
industry 11.1
industrial 10.9
lifestyle 10.8
metal 10.5
high 10.4
male 9.9
steel 9.8
old 9.7
person 9.7
technology 9.6
vessel 9.5
bridge 9.5
sea 9.4
active 9.2
sport 9.2
craft 9.1
sailboat 8.8
manufacturing 8.8
looking 8.8
skyscraper 8.6
black 8.4
street 8.3
rigging 8.2
transportation 8.1
structure 8.1
tower 8
scene 7.8
labor 7.8
boats 7.8
portrait 7.8
full length 7.7
pirate 7.7
tall 7.5
fun 7.5
stage 7.4
holding 7.4
rope 7.4
reflection 7.3
night 7.1
job 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 88.8
black and white 75.4
person 67.3
clothing 65.9
ship 62.1

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 72.9%
Calm 100%
Sad 0%
Angry 0%
Confused 0%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

Feature analysis

Amazon

Person 99.5%
Dog 70.1%

Captions

Microsoft
created on 2022-01-15

a group of people standing next to a dog 28.9%

Text analysis

Amazon

EVELLA