Human Generated Data

Title

Untitled (boy standing with champion pig in Lockhart town square)

Date

1947

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2961

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Person 99.6
Human 99.6
Person 99.6
Person 99.1
Person 98.8
Pig 97.5
Mammal 97.5
Animal 97.5
Person 94.6
Person 90.7
Wheel 86.5
Machine 86.5
Car 84.9
Transportation 84.9
Vehicle 84.9
Automobile 84.9
Bull 83.8
Cattle 65.3
Buffalo 56.9
Wildlife 56.9
Person 55.6
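
For reference, label/confidence pairs like those above are what Amazon Rekognition's DetectLabels operation returns. The sketch below is illustrative only and is not the museum's own pipeline; the file path, region, and request parameters are assumptions.

```python
# Minimal sketch: label detection with Amazon Rekognition (boto3).
# Assumptions: AWS credentials are configured; "photo.jpg" is a
# hypothetical local copy of the image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=50,
)

# Each label has a name and a 0-100 confidence score, which is the
# format of the "Person 99.6", "Pig 97.5", ... entries above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```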

Clarifai
created on 2023-10-26

people 99.9
canine 98.9
adult 98.2
monochrome 97.7
group 97.7
dog 97.3
man 95.7
group together 95.1
three 90.1
wear 89.7
woman 86.6
several 86.2
mammal 85.2
two 84.3
uniform 82.9
child 82.8
street 82.5
administration 80.7
portrait 80.3
retro 78.8

Imagga
created on 2022-01-21

hog 91.1
swine 80.5
ungulate 36.2
farm 23.2
pig 20.2
piggy 20.2
pink 17.5
mammal 15.8
people 15.6
piglet 15.3
man 14.1
pork 13.4
rural 12.3
water 12
snout 11.8
person 11.2
animals 11.1
agriculture 10.5
head 10.1
cute 10
male 9.9
ears 9.6
nose 9.5
senior 9.4
business 9.1
pigs 8.9
family 8.9
adult 8.6
travel 8.4
finance 8.4
future 8.4
sky 8.3
outdoors 8.2
vacation 8.2
happy 8.1
domestic 8.1
bank 8.1
smiling 7.9
livestock 7.9
life 7.8
portrait 7.8
ocean 7.5
investment 7.3
pen 7.3
looking 7.2
piggy bank 7.1
country 7

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

text 99.3
person 97.5
outdoor 94.2
animal 87.4
clothing 62.3
old 53.7

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 60.9%
Happy 83.2%
Calm 10.5%
Surprised 3.2%
Sad 1.1%
Angry 0.8%
Disgusted 0.4%
Confused 0.4%
Fear 0.4%

AWS Rekognition

Age 21-29
Gender Female, 59.4%
Calm 71.4%
Surprised 24%
Disgusted 1.1%
Happy 0.9%
Confused 0.9%
Sad 0.6%
Angry 0.6%
Fear 0.5%

AWS Rekognition

Age 27-37
Gender Male, 95.9%
Calm 91.7%
Sad 5.9%
Happy 1.2%
Confused 0.7%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 99.1%
Calm 32.1%
Sad 20.6%
Angry 16.4%
Confused 12.5%
Happy 12%
Fear 2.5%
Surprised 2.5%
Disgusted 1.4%
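
The age range, gender, and emotion percentages in the AWS Rekognition blocks above are per-face attributes of the kind returned by the DetectFaces operation when all attributes are requested. A minimal sketch, with the same hypothetical file path and credential assumptions as above:

```python
# Minimal sketch: per-face attributes with Amazon Rekognition DetectFaces.
# Assumptions: AWS credentials are configured; "photo.jpg" is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences are reported per emotion and need not sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```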

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
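
The Google Vision entries report likelihood ratings (Very unlikely through Very likely) rather than percentages; these correspond to the per-face likelihood enums returned by Cloud Vision face detection. A minimal sketch using the google-cloud-vision client; the credential setup and file path are assumptions.

```python
# Minimal sketch: per-face likelihoods with Google Cloud Vision.
# Assumptions: GOOGLE_APPLICATION_CREDENTIALS points at a valid key;
# "photo.jpg" is a hypothetical local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # which maps onto the "Surprise Very unlikely" style entries above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```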

Feature analysis

Amazon

Person 99.6%
Pig 97.5%
Wheel 86.5%
Car 84.9%

Categories

Imagga

paintings art 97.4%
interior objects 2.4%

Text analysis

Amazon

EEI
ZOCKHART
YТ3-
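
The fragments listed under Amazon above are scene-text detections of the kind returned by Rekognition's DetectText operation, which reads text visible in the photograph (signage, film edge markings) and reports each detection with a confidence score. A minimal sketch, same assumptions as the earlier Rekognition examples:

```python
# Minimal sketch: scene-text detection with Amazon Rekognition.
# Assumptions: AWS credentials are configured; "photo.jpg" is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections are whole lines of text; WORD detections are single words.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```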

Google

T33 KODVK- E
T33
E
KODVK-