Human Generated Data

Title

Untitled (couple with young boy examining toy boat in Christmas living room)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9153

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.5
Person 97.5
Clothing 91.4
Apparel 91.4
Wheel 89
Machine 89
Sunglasses 87
Accessories 87
Accessory 87
Transportation 68.7
Meal 68.6
Food 68.6
Vehicle 67.6
People 64.8
Female 64.7
Urban 63.8
Car 63.3
Automobile 63.3
Person 63
Face 62.3
Photography 61.2
Photo 61.2
Sitting 57.9
Poster 55.8
Advertisement 55.8
Building 55.2
Wheel 50.8
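
The label/confidence pairs above match the output shape of AWS Rekognition's DetectLabels operation. Below is a minimal sketch, assuming boto3 with configured AWS credentials, of how such a list could be produced; the file name and region are hypothetical, not taken from this record.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical scan of the photograph; Rekognition also accepts S3 objects.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,  # the list above bottoms out near 50
    )

for label in response["Labels"]:
    # e.g. "Person 99.6"; confidences are floats on a 0-100 scale
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```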

Clarifai
created on 2023-10-26

people 100
adult 99.2
vehicle 98.6
transportation system 97.6
child 97.4
two 97.3
group 96.7
man 96.5
group together 95.6
woman 94.3
monochrome 93.8
recreation 92.7
three 91.7
four 87.3
offspring 86.6
sitting 85.5
several 83.2
many 79.4
driver 78.1
war 78
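
Concept/confidence pairs like the Clarifai list above can be requested from Clarifai's v2 predict REST endpoint. The sketch below is an assumption about that endpoint's shape, which has changed across versions; the token, model name, and image URL are placeholders.

```python
import requests

# Placeholder token and image URL; the v2 predict endpoint and payload shape
# are assumptions and may differ from the version used for this record.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # e.g. "people 100"; values are 0..1, scaled to percentages here
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```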

Imagga
created on 2022-01-23

man 31.6
male 29.8
person 24.8
people 23.4
men 18
adult 15
businessman 15
portrait 13.6
barbershop 13.4
seller 12.8
shop 12.5
couple 12.2
business 12.1
worker 12
uniform 11.5
happiness 11
musical instrument 10.5
old 10.4
sitting 10.3
room 10.2
happy 10
kin 9.8
family 9.8
women 9.5
vehicle 9.4
work 9.4
smile 9.3
outdoor 9.2
home 8.8
pedestrian 8.7
military 8.7
smiling 8.7
love 8.7
day 8.6
mercantile establishment 8.6
black 8.4
patient 8.3
traditional 8.3
danger 8.2
clothing 8.1
suit 8.1
to 8
job 8
world 7.8
soldier 7.8
groom 7.8
war 7.7
industry 7.7
casual 7.6
horizontal 7.5
house 7.5
outdoors 7.5
gun 7.5
holding 7.4
water 7.3
building 7.3
cheerful 7.3
new 7.3
industrial 7.3
group 7.2
dirty 7.2
lifestyle 7.2
transportation 7.2
team 7.2
car 7.1
percussion instrument 7
travel 7
architecture 7
indoors 7
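
Tag/confidence pairs like the Imagga list above correspond to Imagga's v2 tagging REST API. A sketch under that assumption follows; the credentials and image URL are placeholders.

```python
import requests

# Placeholder credentials and image URL; endpoint per Imagga's documented
# v2 tagging API, assumed to match the version used for this record.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),  # HTTP Basic auth
)
for item in resp.json()["result"]["tags"]:
    # e.g. "man 31.6"; tags are returned ordered by confidence
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```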

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.1
person 96.9
outdoor 88.3
clothing 79.9
man 77.3
black and white 67.8
vehicle 63.8
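
The Microsoft tags above match the output of the Azure Computer Vision tag operation. A sketch follows; the endpoint, key, image URL, and the v3.2 REST path are all assumptions rather than details from this record.

```python
import requests

# Placeholder endpoint, key, and image URL; the v3.2 REST path is an
# assumption about the Computer Vision version behind this record.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
for tag in resp.json()["tags"]:
    # Confidences come back as 0..1; the record above prints percentages
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```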

Face analysis

AWS Rekognition

Age 9-17
Gender Female, 88.7%
Sad 97%
Calm 2.4%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 33-41
Gender Female, 88.8%
Happy 55.6%
Calm 38%
Sad 4.9%
Confused 0.8%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 41-49
Gender Male, 72.5%
Calm 98.3%
Sad 1.6%
Confused 0%
Disgusted 0%
Happy 0%
Angry 0%
Surprised 0%
Fear 0%
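
The age-range, gender, and emotion estimates above match AWS Rekognition's DetectFaces operation with full attributes. A minimal sketch, again assuming boto3 with configured credentials and a hypothetical file name:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical scan of the photograph
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; the record lists them highest-first
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```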

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
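
The per-face likelihood buckets above ("Very unlikely" through "Very likely") match Google Cloud Vision face detection. A sketch using the google-cloud-vision client library; the file name is hypothetical and application credentials are assumed to be configured.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application credentials

with open("photo.jpg", "rb") as f:  # hypothetical scan of the photograph
    image = vision.Image(content=f.read())

faces = client.face_detection(image=image).face_annotations

for face in faces:
    # Enum names are upper snake case (VERY_UNLIKELY ... VERY_LIKELY);
    # the record above renders them as "Very unlikely" etc.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```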

Feature analysis

Amazon

Person 99.6%
Wheel 89%
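
The feature analysis entries are the subset of Rekognition labels that return localized Instances (bounding boxes) in the same DetectLabels response as the tags above. A sketch of extracting them, with the same hypothetical file name and region:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical scan of the photograph
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        # Only labels with detected instances (e.g. Person, Wheel) appear here;
        # BoundingBox coordinates are normalized to the image dimensions.
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at left={box["Left"]:.2f}, top={box["Top"]:.2f}')
```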

Text analysis

Amazon

бигоо
13150
COLELA
бигоо COLELA LIFE
LIFE
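
The strings above are raw OCR detections, kept verbatim; misreads such as "бигоо" are typical on period print material. They match the output of AWS Rekognition's DetectText operation, sketched below with the same hypothetical file name and region.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical scan of the photograph
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same content
        print(detection["DetectedText"])
```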